Nov 24 13:42:13 localhost kernel: Linux version 5.14.0-639.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Sat Nov 15 10:30:41 UTC 2025
Nov 24 13:42:13 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 24 13:42:13 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 24 13:42:13 localhost kernel: BIOS-provided physical RAM map:
Nov 24 13:42:13 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 24 13:42:13 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 24 13:42:13 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 24 13:42:13 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 24 13:42:13 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 24 13:42:13 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 24 13:42:13 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 24 13:42:13 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 24 13:42:13 localhost kernel: NX (Execute Disable) protection: active
Nov 24 13:42:13 localhost kernel: APIC: Static calls initialized
Nov 24 13:42:13 localhost kernel: SMBIOS 2.8 present.
Nov 24 13:42:13 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 24 13:42:13 localhost kernel: Hypervisor detected: KVM
Nov 24 13:42:13 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 24 13:42:13 localhost kernel: kvm-clock: using sched offset of 5443605714 cycles
Nov 24 13:42:13 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 24 13:42:13 localhost kernel: tsc: Detected 2799.998 MHz processor
Nov 24 13:42:13 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Nov 24 13:42:13 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Nov 24 13:42:13 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 24 13:42:13 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 24 13:42:13 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 24 13:42:13 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 24 13:42:13 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 24 13:42:13 localhost kernel: Using GB pages for direct mapping
Nov 24 13:42:13 localhost kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 24 13:42:13 localhost kernel: ACPI: Early table checksum verification disabled
Nov 24 13:42:13 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 24 13:42:13 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 13:42:13 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 13:42:13 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 13:42:13 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 24 13:42:13 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 13:42:13 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 24 13:42:13 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 24 13:42:13 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 24 13:42:13 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 24 13:42:13 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 24 13:42:13 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 24 13:42:13 localhost kernel: No NUMA configuration found
Nov 24 13:42:13 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 24 13:42:13 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Nov 24 13:42:13 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 24 13:42:13 localhost kernel: Zone ranges:
Nov 24 13:42:13 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 24 13:42:13 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 24 13:42:13 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 24 13:42:13 localhost kernel:   Device   empty
Nov 24 13:42:13 localhost kernel: Movable zone start for each node
Nov 24 13:42:13 localhost kernel: Early memory node ranges
Nov 24 13:42:13 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 24 13:42:13 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 24 13:42:13 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 24 13:42:13 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 24 13:42:13 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 24 13:42:13 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 24 13:42:13 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 24 13:42:13 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Nov 24 13:42:13 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 24 13:42:13 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 24 13:42:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 24 13:42:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 24 13:42:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 24 13:42:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 24 13:42:13 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 24 13:42:13 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 24 13:42:13 localhost kernel: TSC deadline timer available
Nov 24 13:42:13 localhost kernel: CPU topo: Max. logical packages:   8
Nov 24 13:42:13 localhost kernel: CPU topo: Max. logical dies:       8
Nov 24 13:42:13 localhost kernel: CPU topo: Max. dies per package:   1
Nov 24 13:42:13 localhost kernel: CPU topo: Max. threads per core:   1
Nov 24 13:42:13 localhost kernel: CPU topo: Num. cores per package:     1
Nov 24 13:42:13 localhost kernel: CPU topo: Num. threads per package:   1
Nov 24 13:42:13 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 24 13:42:13 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 24 13:42:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 24 13:42:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 24 13:42:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 24 13:42:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 24 13:42:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 24 13:42:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 24 13:42:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 24 13:42:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 24 13:42:13 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 24 13:42:13 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 24 13:42:13 localhost kernel: Booting paravirtualized kernel on KVM
Nov 24 13:42:13 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 24 13:42:13 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 24 13:42:13 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 24 13:42:13 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Nov 24 13:42:13 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Nov 24 13:42:13 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 24 13:42:13 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 24 13:42:13 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64", will be passed to user space.
Nov 24 13:42:13 localhost kernel: random: crng init done
Nov 24 13:42:13 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 24 13:42:13 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 24 13:42:13 localhost kernel: Fallback order for Node 0: 0 
Nov 24 13:42:13 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 24 13:42:13 localhost kernel: Policy zone: Normal
Nov 24 13:42:13 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 24 13:42:13 localhost kernel: software IO TLB: area num 8.
Nov 24 13:42:13 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 24 13:42:13 localhost kernel: ftrace: allocating 49298 entries in 193 pages
Nov 24 13:42:13 localhost kernel: ftrace: allocated 193 pages with 3 groups
Nov 24 13:42:13 localhost kernel: Dynamic Preempt: voluntary
Nov 24 13:42:13 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 24 13:42:13 localhost kernel: rcu:         RCU event tracing is enabled.
Nov 24 13:42:13 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 24 13:42:13 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Nov 24 13:42:13 localhost kernel:         Rude variant of Tasks RCU enabled.
Nov 24 13:42:13 localhost kernel:         Tracing variant of Tasks RCU enabled.
Nov 24 13:42:13 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 24 13:42:13 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 24 13:42:13 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 24 13:42:13 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 24 13:42:13 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 24 13:42:13 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 24 13:42:13 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 24 13:42:13 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 24 13:42:13 localhost kernel: Console: colour VGA+ 80x25
Nov 24 13:42:13 localhost kernel: printk: console [ttyS0] enabled
Nov 24 13:42:13 localhost kernel: ACPI: Core revision 20230331
Nov 24 13:42:13 localhost kernel: APIC: Switch to symmetric I/O mode setup
Nov 24 13:42:13 localhost kernel: x2apic enabled
Nov 24 13:42:13 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Nov 24 13:42:13 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 24 13:42:13 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 24 13:42:13 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 24 13:42:13 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 24 13:42:13 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 24 13:42:13 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 24 13:42:13 localhost kernel: Spectre V2 : Mitigation: Retpolines
Nov 24 13:42:13 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 24 13:42:13 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 24 13:42:13 localhost kernel: RETBleed: Mitigation: untrained return thunk
Nov 24 13:42:13 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 24 13:42:13 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 24 13:42:13 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 24 13:42:13 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 24 13:42:13 localhost kernel: x86/bugs: return thunk changed
Nov 24 13:42:13 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 24 13:42:13 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 24 13:42:13 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 24 13:42:13 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 24 13:42:13 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 24 13:42:13 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 24 13:42:13 localhost kernel: Freeing SMP alternatives memory: 40K
Nov 24 13:42:13 localhost kernel: pid_max: default: 32768 minimum: 301
Nov 24 13:42:13 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 24 13:42:13 localhost kernel: landlock: Up and running.
Nov 24 13:42:13 localhost kernel: Yama: becoming mindful.
Nov 24 13:42:13 localhost kernel: SELinux:  Initializing.
Nov 24 13:42:13 localhost kernel: LSM support for eBPF active
Nov 24 13:42:13 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 24 13:42:13 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 24 13:42:13 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 24 13:42:13 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 24 13:42:13 localhost kernel: ... version:                0
Nov 24 13:42:13 localhost kernel: ... bit width:              48
Nov 24 13:42:13 localhost kernel: ... generic registers:      6
Nov 24 13:42:13 localhost kernel: ... value mask:             0000ffffffffffff
Nov 24 13:42:13 localhost kernel: ... max period:             00007fffffffffff
Nov 24 13:42:13 localhost kernel: ... fixed-purpose events:   0
Nov 24 13:42:13 localhost kernel: ... event mask:             000000000000003f
Nov 24 13:42:13 localhost kernel: signal: max sigframe size: 1776
Nov 24 13:42:13 localhost kernel: rcu: Hierarchical SRCU implementation.
Nov 24 13:42:13 localhost kernel: rcu:         Max phase no-delay instances is 400.
Nov 24 13:42:13 localhost kernel: smp: Bringing up secondary CPUs ...
Nov 24 13:42:13 localhost kernel: smpboot: x86: Booting SMP configuration:
Nov 24 13:42:13 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 24 13:42:13 localhost kernel: smp: Brought up 1 node, 8 CPUs
Nov 24 13:42:13 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 24 13:42:13 localhost kernel: node 0 deferred pages initialised in 10ms
Nov 24 13:42:13 localhost kernel: Memory: 7765956K/8388068K available (16384K kernel code, 5786K rwdata, 13900K rodata, 4188K init, 7176K bss, 616264K reserved, 0K cma-reserved)
Nov 24 13:42:13 localhost kernel: devtmpfs: initialized
Nov 24 13:42:13 localhost kernel: x86/mm: Memory block size: 128MB
Nov 24 13:42:13 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 24 13:42:13 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 24 13:42:13 localhost kernel: pinctrl core: initialized pinctrl subsystem
Nov 24 13:42:13 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 24 13:42:13 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 24 13:42:13 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 24 13:42:13 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 24 13:42:13 localhost kernel: audit: initializing netlink subsys (disabled)
Nov 24 13:42:13 localhost kernel: audit: type=2000 audit(1763991731.631:1): state=initialized audit_enabled=0 res=1
Nov 24 13:42:13 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 24 13:42:13 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 24 13:42:13 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 24 13:42:13 localhost kernel: cpuidle: using governor menu
Nov 24 13:42:13 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 24 13:42:13 localhost kernel: PCI: Using configuration type 1 for base access
Nov 24 13:42:13 localhost kernel: PCI: Using configuration type 1 for extended access
Nov 24 13:42:13 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 24 13:42:13 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 24 13:42:13 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 24 13:42:13 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 24 13:42:13 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 24 13:42:13 localhost kernel: Demotion targets for Node 0: null
Nov 24 13:42:13 localhost kernel: cryptd: max_cpu_qlen set to 1000
Nov 24 13:42:13 localhost kernel: ACPI: Added _OSI(Module Device)
Nov 24 13:42:13 localhost kernel: ACPI: Added _OSI(Processor Device)
Nov 24 13:42:13 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 24 13:42:13 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 24 13:42:13 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 24 13:42:13 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 24 13:42:13 localhost kernel: ACPI: Interpreter enabled
Nov 24 13:42:13 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 24 13:42:13 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Nov 24 13:42:13 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 24 13:42:13 localhost kernel: PCI: Using E820 reservations for host bridge windows
Nov 24 13:42:13 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 24 13:42:13 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 24 13:42:13 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [3] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [4] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [5] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [6] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [7] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [8] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [9] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [10] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [11] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [12] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [13] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [14] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [15] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [16] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [17] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [18] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [19] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [20] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [21] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [22] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [23] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [24] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [25] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [26] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [27] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [28] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [29] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [30] registered
Nov 24 13:42:13 localhost kernel: acpiphp: Slot [31] registered
Nov 24 13:42:13 localhost kernel: PCI host bridge to bus 0000:00
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 24 13:42:13 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 24 13:42:13 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 24 13:42:13 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 24 13:42:13 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 24 13:42:13 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 24 13:42:13 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 24 13:42:13 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 24 13:42:13 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 24 13:42:13 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 24 13:42:13 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 24 13:42:13 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 24 13:42:13 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 24 13:42:13 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 24 13:42:13 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 24 13:42:13 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 24 13:42:13 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 24 13:42:13 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 24 13:42:13 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 24 13:42:13 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 24 13:42:13 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 24 13:42:13 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 24 13:42:13 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 24 13:42:13 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 24 13:42:13 localhost kernel: iommu: Default domain type: Translated
Nov 24 13:42:13 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 24 13:42:13 localhost kernel: SCSI subsystem initialized
Nov 24 13:42:13 localhost kernel: ACPI: bus type USB registered
Nov 24 13:42:13 localhost kernel: usbcore: registered new interface driver usbfs
Nov 24 13:42:13 localhost kernel: usbcore: registered new interface driver hub
Nov 24 13:42:13 localhost kernel: usbcore: registered new device driver usb
Nov 24 13:42:13 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 24 13:42:13 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 24 13:42:13 localhost kernel: PTP clock support registered
Nov 24 13:42:13 localhost kernel: EDAC MC: Ver: 3.0.0
Nov 24 13:42:13 localhost kernel: NetLabel: Initializing
Nov 24 13:42:13 localhost kernel: NetLabel:  domain hash size = 128
Nov 24 13:42:13 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 24 13:42:13 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Nov 24 13:42:13 localhost kernel: PCI: Using ACPI for IRQ routing
Nov 24 13:42:13 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Nov 24 13:42:13 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Nov 24 13:42:13 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Nov 24 13:42:13 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 24 13:42:13 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 24 13:42:13 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 24 13:42:13 localhost kernel: vgaarb: loaded
Nov 24 13:42:13 localhost kernel: clocksource: Switched to clocksource kvm-clock
Nov 24 13:42:13 localhost kernel: VFS: Disk quotas dquot_6.6.0
Nov 24 13:42:13 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 24 13:42:13 localhost kernel: pnp: PnP ACPI init
Nov 24 13:42:13 localhost kernel: pnp 00:03: [dma 2]
Nov 24 13:42:13 localhost kernel: pnp: PnP ACPI: found 5 devices
Nov 24 13:42:13 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 24 13:42:13 localhost kernel: NET: Registered PF_INET protocol family
Nov 24 13:42:13 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 24 13:42:13 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 24 13:42:13 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 24 13:42:13 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 24 13:42:13 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 24 13:42:13 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 24 13:42:13 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 24 13:42:13 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 24 13:42:13 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 24 13:42:13 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 24 13:42:13 localhost kernel: NET: Registered PF_XDP protocol family
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 24 13:42:13 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 24 13:42:13 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 24 13:42:13 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 24 13:42:13 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 109046 usecs
Nov 24 13:42:13 localhost kernel: PCI: CLS 0 bytes, default 64
Nov 24 13:42:13 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 24 13:42:13 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 24 13:42:13 localhost kernel: ACPI: bus type thunderbolt registered
Nov 24 13:42:13 localhost kernel: Trying to unpack rootfs image as initramfs...
Nov 24 13:42:13 localhost kernel: Initialise system trusted keyrings
Nov 24 13:42:13 localhost kernel: Key type blacklist registered
Nov 24 13:42:13 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 24 13:42:13 localhost kernel: zbud: loaded
Nov 24 13:42:13 localhost kernel: integrity: Platform Keyring initialized
Nov 24 13:42:13 localhost kernel: integrity: Machine keyring initialized
Nov 24 13:42:13 localhost kernel: Freeing initrd memory: 85868K
Nov 24 13:42:13 localhost kernel: NET: Registered PF_ALG protocol family
Nov 24 13:42:13 localhost kernel: xor: automatically using best checksumming function   avx       
Nov 24 13:42:13 localhost kernel: Key type asymmetric registered
Nov 24 13:42:13 localhost kernel: Asymmetric key parser 'x509' registered
Nov 24 13:42:13 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 24 13:42:13 localhost kernel: io scheduler mq-deadline registered
Nov 24 13:42:13 localhost kernel: io scheduler kyber registered
Nov 24 13:42:13 localhost kernel: io scheduler bfq registered
Nov 24 13:42:13 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 24 13:42:13 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 24 13:42:13 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 24 13:42:13 localhost kernel: ACPI: button: Power Button [PWRF]
Nov 24 13:42:13 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 24 13:42:13 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 24 13:42:13 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 24 13:42:13 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 24 13:42:13 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 24 13:42:13 localhost kernel: Non-volatile memory driver v1.3
Nov 24 13:42:13 localhost kernel: rdac: device handler registered
Nov 24 13:42:13 localhost kernel: hp_sw: device handler registered
Nov 24 13:42:13 localhost kernel: emc: device handler registered
Nov 24 13:42:13 localhost kernel: alua: device handler registered
Nov 24 13:42:13 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 24 13:42:13 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 24 13:42:13 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 24 13:42:13 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 24 13:42:13 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 24 13:42:13 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 24 13:42:13 localhost kernel: usb usb1: Product: UHCI Host Controller
Nov 24 13:42:13 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-639.el9.x86_64 uhci_hcd
Nov 24 13:42:13 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 24 13:42:13 localhost kernel: hub 1-0:1.0: USB hub found
Nov 24 13:42:13 localhost kernel: hub 1-0:1.0: 2 ports detected
Nov 24 13:42:13 localhost kernel: usbcore: registered new interface driver usbserial_generic
Nov 24 13:42:13 localhost kernel: usbserial: USB Serial support registered for generic
Nov 24 13:42:13 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 24 13:42:13 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 24 13:42:13 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 24 13:42:13 localhost kernel: mousedev: PS/2 mouse device common for all mice
Nov 24 13:42:13 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 24 13:42:13 localhost kernel: rtc_cmos 00:04: registered as rtc0
Nov 24 13:42:13 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 24 13:42:13 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-11-24T13:42:12 UTC (1763991732)
Nov 24 13:42:13 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 24 13:42:13 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 24 13:42:13 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 24 13:42:13 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 24 13:42:13 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 24 13:42:13 localhost kernel: usbcore: registered new interface driver usbhid
Nov 24 13:42:13 localhost kernel: usbhid: USB HID core driver
Nov 24 13:42:13 localhost kernel: drop_monitor: Initializing network drop monitor service
Nov 24 13:42:13 localhost kernel: Initializing XFRM netlink socket
Nov 24 13:42:13 localhost kernel: NET: Registered PF_INET6 protocol family
Nov 24 13:42:13 localhost kernel: Segment Routing with IPv6
Nov 24 13:42:13 localhost kernel: NET: Registered PF_PACKET protocol family
Nov 24 13:42:13 localhost kernel: mpls_gso: MPLS GSO support
Nov 24 13:42:13 localhost kernel: IPI shorthand broadcast: enabled
Nov 24 13:42:13 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Nov 24 13:42:13 localhost kernel: AES CTR mode by8 optimization enabled
Nov 24 13:42:13 localhost kernel: sched_clock: Marking stable (1296017291, 153617705)->(1596416900, -146781904)
Nov 24 13:42:13 localhost kernel: registered taskstats version 1
Nov 24 13:42:13 localhost kernel: Loading compiled-in X.509 certificates
Nov 24 13:42:13 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 24 13:42:13 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 24 13:42:13 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 24 13:42:13 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 24 13:42:13 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 24 13:42:13 localhost kernel: Demotion targets for Node 0: null
Nov 24 13:42:13 localhost kernel: page_owner is disabled
Nov 24 13:42:13 localhost kernel: Key type .fscrypt registered
Nov 24 13:42:13 localhost kernel: Key type fscrypt-provisioning registered
Nov 24 13:42:13 localhost kernel: Key type big_key registered
Nov 24 13:42:13 localhost kernel: Key type encrypted registered
Nov 24 13:42:13 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 24 13:42:13 localhost kernel: Loading compiled-in module X.509 certificates
Nov 24 13:42:13 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: f7751431c703da8a75244ce96aad68601cf1c188'
Nov 24 13:42:13 localhost kernel: ima: Allocated hash algorithm: sha256
Nov 24 13:42:13 localhost kernel: ima: No architecture policies found
Nov 24 13:42:13 localhost kernel: evm: Initialising EVM extended attributes:
Nov 24 13:42:13 localhost kernel: evm: security.selinux
Nov 24 13:42:13 localhost kernel: evm: security.SMACK64 (disabled)
Nov 24 13:42:13 localhost kernel: evm: security.SMACK64EXEC (disabled)
Nov 24 13:42:13 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 24 13:42:13 localhost kernel: evm: security.SMACK64MMAP (disabled)
Nov 24 13:42:13 localhost kernel: evm: security.apparmor (disabled)
Nov 24 13:42:13 localhost kernel: evm: security.ima
Nov 24 13:42:13 localhost kernel: evm: security.capability
Nov 24 13:42:13 localhost kernel: evm: HMAC attrs: 0x1
Nov 24 13:42:13 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 24 13:42:13 localhost kernel: Running certificate verification RSA selftest
Nov 24 13:42:13 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 24 13:42:13 localhost kernel: Running certificate verification ECDSA selftest
Nov 24 13:42:13 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 24 13:42:13 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 24 13:42:13 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Nov 24 13:42:13 localhost kernel: usb 1-1: Manufacturer: QEMU
Nov 24 13:42:13 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 24 13:42:13 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 24 13:42:13 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 24 13:42:13 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 24 13:42:13 localhost kernel: clk: Disabling unused clocks
Nov 24 13:42:13 localhost kernel: Freeing unused decrypted memory: 2028K
Nov 24 13:42:13 localhost kernel: Freeing unused kernel image (initmem) memory: 4188K
Nov 24 13:42:13 localhost kernel: Write protecting the kernel read-only data: 30720k
Nov 24 13:42:13 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 24 13:42:13 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 24 13:42:13 localhost kernel: Run /init as init process
Nov 24 13:42:13 localhost kernel:   with arguments:
Nov 24 13:42:13 localhost kernel:     /init
Nov 24 13:42:13 localhost kernel:   with environment:
Nov 24 13:42:13 localhost kernel:     HOME=/
Nov 24 13:42:13 localhost kernel:     TERM=linux
Nov 24 13:42:13 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64
Nov 24 13:42:13 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 24 13:42:13 localhost systemd[1]: Detected virtualization kvm.
Nov 24 13:42:13 localhost systemd[1]: Detected architecture x86-64.
Nov 24 13:42:13 localhost systemd[1]: Running in initrd.
Nov 24 13:42:13 localhost systemd[1]: No hostname configured, using default hostname.
Nov 24 13:42:13 localhost systemd[1]: Hostname set to <localhost>.
Nov 24 13:42:13 localhost systemd[1]: Initializing machine ID from VM UUID.
Nov 24 13:42:13 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Nov 24 13:42:13 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 24 13:42:13 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 24 13:42:13 localhost systemd[1]: Reached target Initrd /usr File System.
Nov 24 13:42:13 localhost systemd[1]: Reached target Local File Systems.
Nov 24 13:42:13 localhost systemd[1]: Reached target Path Units.
Nov 24 13:42:13 localhost systemd[1]: Reached target Slice Units.
Nov 24 13:42:13 localhost systemd[1]: Reached target Swaps.
Nov 24 13:42:13 localhost systemd[1]: Reached target Timer Units.
Nov 24 13:42:13 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 24 13:42:13 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Nov 24 13:42:13 localhost systemd[1]: Listening on Journal Socket.
Nov 24 13:42:13 localhost systemd[1]: Listening on udev Control Socket.
Nov 24 13:42:13 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 24 13:42:13 localhost systemd[1]: Reached target Socket Units.
Nov 24 13:42:13 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 24 13:42:13 localhost systemd[1]: Starting Journal Service...
Nov 24 13:42:13 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 24 13:42:13 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 24 13:42:13 localhost systemd[1]: Starting Create System Users...
Nov 24 13:42:13 localhost systemd[1]: Starting Setup Virtual Console...
Nov 24 13:42:13 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 24 13:42:13 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 24 13:42:13 localhost systemd[1]: Finished Create System Users.
Nov 24 13:42:13 localhost systemd-journald[306]: Journal started
Nov 24 13:42:13 localhost systemd-journald[306]: Runtime Journal (/run/log/journal/29821a9d05ed4e9eb48f8dca86832284) is 8.0M, max 153.6M, 145.6M free.
Nov 24 13:42:13 localhost systemd-sysusers[310]: Creating group 'users' with GID 100.
Nov 24 13:42:13 localhost systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Nov 24 13:42:13 localhost systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 24 13:42:13 localhost systemd[1]: Started Journal Service.
Nov 24 13:42:13 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 24 13:42:13 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 24 13:42:13 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 24 13:42:13 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 24 13:42:13 localhost systemd[1]: Finished Setup Virtual Console.
Nov 24 13:42:13 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 24 13:42:13 localhost systemd[1]: Starting dracut cmdline hook...
Nov 24 13:42:13 localhost dracut-cmdline[326]: dracut-9 dracut-057-102.git20250818.el9
Nov 24 13:42:13 localhost dracut-cmdline[326]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-639.el9.x86_64 root=UUID=47e3724e-7a1b-439a-9543-b98c9a290709 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 24 13:42:13 localhost systemd[1]: Finished dracut cmdline hook.
Nov 24 13:42:13 localhost systemd[1]: Starting dracut pre-udev hook...
Nov 24 13:42:13 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 24 13:42:13 localhost kernel: device-mapper: uevent: version 1.0.3
Nov 24 13:42:13 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 24 13:42:13 localhost kernel: RPC: Registered named UNIX socket transport module.
Nov 24 13:42:13 localhost kernel: RPC: Registered udp transport module.
Nov 24 13:42:13 localhost kernel: RPC: Registered tcp transport module.
Nov 24 13:42:13 localhost kernel: RPC: Registered tcp-with-tls transport module.
Nov 24 13:42:13 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 24 13:42:13 localhost rpc.statd[442]: Version 2.5.4 starting
Nov 24 13:42:13 localhost rpc.statd[442]: Initializing NSM state
Nov 24 13:42:14 localhost rpc.idmapd[447]: Setting log level to 0
Nov 24 13:42:14 localhost systemd[1]: Finished dracut pre-udev hook.
Nov 24 13:42:14 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 24 13:42:14 localhost systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Nov 24 13:42:14 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 24 13:42:14 localhost systemd[1]: Starting dracut pre-trigger hook...
Nov 24 13:42:14 localhost systemd[1]: Finished dracut pre-trigger hook.
Nov 24 13:42:14 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 24 13:42:14 localhost systemd[1]: Created slice Slice /system/modprobe.
Nov 24 13:42:14 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 24 13:42:14 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 24 13:42:14 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 13:42:14 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 24 13:42:14 localhost systemd[1]: Mounting Kernel Configuration File System...
Nov 24 13:42:14 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 24 13:42:14 localhost systemd[1]: Reached target Network.
Nov 24 13:42:14 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 24 13:42:14 localhost systemd[1]: Starting dracut initqueue hook...
Nov 24 13:42:14 localhost systemd[1]: Mounted Kernel Configuration File System.
Nov 24 13:42:14 localhost systemd[1]: Reached target System Initialization.
Nov 24 13:42:14 localhost systemd[1]: Reached target Basic System.
Nov 24 13:42:14 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 24 13:42:14 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 24 13:42:14 localhost kernel:  vda: vda1
Nov 24 13:42:14 localhost kernel: libata version 3.00 loaded.
Nov 24 13:42:14 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Nov 24 13:42:14 localhost kernel: scsi host0: ata_piix
Nov 24 13:42:14 localhost kernel: scsi host1: ata_piix
Nov 24 13:42:14 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 24 13:42:14 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 24 13:42:14 localhost systemd[1]: Found device /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 24 13:42:14 localhost systemd[1]: Reached target Initrd Root Device.
Nov 24 13:42:14 localhost kernel: ata1: found unknown device (class 0)
Nov 24 13:42:14 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 24 13:42:14 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 24 13:42:14 localhost systemd-udevd[475]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:42:14 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 24 13:42:14 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 24 13:42:14 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 24 13:42:14 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Nov 24 13:42:14 localhost systemd[1]: Finished dracut initqueue hook.
Nov 24 13:42:14 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Nov 24 13:42:14 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Nov 24 13:42:14 localhost systemd[1]: Reached target Remote File Systems.
Nov 24 13:42:14 localhost systemd[1]: Starting dracut pre-mount hook...
Nov 24 13:42:14 localhost systemd[1]: Finished dracut pre-mount hook.
Nov 24 13:42:14 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709...
Nov 24 13:42:14 localhost systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Nov 24 13:42:14 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709.
Nov 24 13:42:14 localhost systemd[1]: Mounting /sysroot...
Nov 24 13:42:15 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 24 13:42:15 localhost kernel: XFS (vda1): Mounting V5 Filesystem 47e3724e-7a1b-439a-9543-b98c9a290709
Nov 24 13:42:15 localhost kernel: XFS (vda1): Ending clean mount
Nov 24 13:42:15 localhost systemd[1]: Mounted /sysroot.
Nov 24 13:42:15 localhost systemd[1]: Reached target Initrd Root File System.
Nov 24 13:42:15 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 24 13:42:15 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 24 13:42:15 localhost systemd[1]: Reached target Initrd File Systems.
Nov 24 13:42:15 localhost systemd[1]: Reached target Initrd Default Target.
Nov 24 13:42:15 localhost systemd[1]: Starting dracut mount hook...
Nov 24 13:42:15 localhost systemd[1]: Finished dracut mount hook.
Nov 24 13:42:15 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 24 13:42:15 localhost rpc.idmapd[447]: exiting on signal 15
Nov 24 13:42:15 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 24 13:42:15 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 24 13:42:15 localhost systemd[1]: Stopped target Network.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Timer Units.
Nov 24 13:42:15 localhost systemd[1]: dbus.socket: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 24 13:42:15 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Initrd Default Target.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Basic System.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Initrd Root Device.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Initrd /usr File System.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Path Units.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Remote File Systems.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Slice Units.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Socket Units.
Nov 24 13:42:15 localhost systemd[1]: Stopped target System Initialization.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Local File Systems.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Swaps.
Nov 24 13:42:15 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped dracut mount hook.
Nov 24 13:42:15 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped dracut pre-mount hook.
Nov 24 13:42:15 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Nov 24 13:42:15 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 24 13:42:15 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped dracut initqueue hook.
Nov 24 13:42:15 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped Apply Kernel Variables.
Nov 24 13:42:15 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Nov 24 13:42:15 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped Coldplug All udev Devices.
Nov 24 13:42:15 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped dracut pre-trigger hook.
Nov 24 13:42:15 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 24 13:42:15 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped Setup Virtual Console.
Nov 24 13:42:15 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 24 13:42:15 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 24 13:42:15 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Closed udev Control Socket.
Nov 24 13:42:15 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Closed udev Kernel Socket.
Nov 24 13:42:15 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped dracut pre-udev hook.
Nov 24 13:42:15 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped dracut cmdline hook.
Nov 24 13:42:15 localhost systemd[1]: Starting Cleanup udev Database...
Nov 24 13:42:15 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 24 13:42:15 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Nov 24 13:42:15 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Stopped Create System Users.
Nov 24 13:42:15 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 24 13:42:15 localhost systemd[1]: Finished Cleanup udev Database.
Nov 24 13:42:15 localhost systemd[1]: Reached target Switch Root.
Nov 24 13:42:15 localhost systemd[1]: Starting Switch Root...
Nov 24 13:42:15 localhost systemd[1]: Switching root.
Nov 24 13:42:15 localhost systemd-journald[306]: Journal stopped
Nov 24 13:42:16 localhost systemd-journald[306]: Received SIGTERM from PID 1 (systemd).
Nov 24 13:42:16 localhost kernel: audit: type=1404 audit(1763991735.959:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 24 13:42:16 localhost kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 13:42:16 localhost kernel: SELinux:  policy capability open_perms=1
Nov 24 13:42:16 localhost kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 13:42:16 localhost kernel: SELinux:  policy capability always_check_network=0
Nov 24 13:42:16 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 13:42:16 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 13:42:16 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 13:42:16 localhost kernel: audit: type=1403 audit(1763991736.131:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 24 13:42:16 localhost systemd[1]: Successfully loaded SELinux policy in 176.805ms.
Nov 24 13:42:16 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.521ms.
Nov 24 13:42:16 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 24 13:42:16 localhost systemd[1]: Detected virtualization kvm.
Nov 24 13:42:16 localhost systemd[1]: Detected architecture x86-64.
Nov 24 13:42:16 localhost systemd-rc-local-generator[637]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:42:16 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Nov 24 13:42:16 localhost systemd[1]: Stopped Switch Root.
Nov 24 13:42:16 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 24 13:42:16 localhost systemd[1]: Created slice Slice /system/getty.
Nov 24 13:42:16 localhost systemd[1]: Created slice Slice /system/serial-getty.
Nov 24 13:42:16 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Nov 24 13:42:16 localhost systemd[1]: Created slice User and Session Slice.
Nov 24 13:42:16 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Nov 24 13:42:16 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Nov 24 13:42:16 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 24 13:42:16 localhost systemd[1]: Reached target Local Encrypted Volumes.
Nov 24 13:42:16 localhost systemd[1]: Stopped target Switch Root.
Nov 24 13:42:16 localhost systemd[1]: Stopped target Initrd File Systems.
Nov 24 13:42:16 localhost systemd[1]: Stopped target Initrd Root File System.
Nov 24 13:42:16 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Nov 24 13:42:16 localhost systemd[1]: Reached target Path Units.
Nov 24 13:42:16 localhost systemd[1]: Reached target rpc_pipefs.target.
Nov 24 13:42:16 localhost systemd[1]: Reached target Slice Units.
Nov 24 13:42:16 localhost systemd[1]: Reached target Swaps.
Nov 24 13:42:16 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Nov 24 13:42:16 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Nov 24 13:42:16 localhost systemd[1]: Reached target RPC Port Mapper.
Nov 24 13:42:16 localhost systemd[1]: Listening on Process Core Dump Socket.
Nov 24 13:42:16 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Nov 24 13:42:16 localhost systemd[1]: Listening on udev Control Socket.
Nov 24 13:42:16 localhost systemd[1]: Listening on udev Kernel Socket.
Nov 24 13:42:16 localhost systemd[1]: Mounting Huge Pages File System...
Nov 24 13:42:16 localhost systemd[1]: Mounting POSIX Message Queue File System...
Nov 24 13:42:16 localhost systemd[1]: Mounting Kernel Debug File System...
Nov 24 13:42:16 localhost systemd[1]: Mounting Kernel Trace File System...
Nov 24 13:42:16 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 24 13:42:16 localhost systemd[1]: Starting Create List of Static Device Nodes...
Nov 24 13:42:16 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 24 13:42:16 localhost systemd[1]: Starting Load Kernel Module drm...
Nov 24 13:42:16 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Nov 24 13:42:16 localhost systemd[1]: Starting Load Kernel Module fuse...
Nov 24 13:42:16 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 24 13:42:16 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Nov 24 13:42:16 localhost systemd[1]: Stopped File System Check on Root Device.
Nov 24 13:42:16 localhost systemd[1]: Stopped Journal Service.
Nov 24 13:42:16 localhost systemd[1]: Starting Journal Service...
Nov 24 13:42:16 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 24 13:42:16 localhost systemd[1]: Starting Generate network units from Kernel command line...
Nov 24 13:42:16 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 24 13:42:16 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Nov 24 13:42:16 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 24 13:42:16 localhost systemd[1]: Starting Apply Kernel Variables...
Nov 24 13:42:16 localhost systemd[1]: Starting Coldplug All udev Devices...
Nov 24 13:42:16 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 24 13:42:16 localhost kernel: ACPI: bus type drm_connector registered
Nov 24 13:42:16 localhost systemd[1]: Mounted Huge Pages File System.
Nov 24 13:42:16 localhost systemd[1]: Mounted POSIX Message Queue File System.
Nov 24 13:42:16 localhost kernel: fuse: init (API version 7.37)
Nov 24 13:42:16 localhost systemd[1]: Mounted Kernel Debug File System.
Nov 24 13:42:16 localhost systemd-journald[679]: Journal started
Nov 24 13:42:16 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 24 13:42:16 localhost systemd[1]: Queued start job for default target Multi-User System.
Nov 24 13:42:16 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 24 13:42:16 localhost systemd[1]: Started Journal Service.
Nov 24 13:42:16 localhost systemd[1]: Mounted Kernel Trace File System.
Nov 24 13:42:16 localhost systemd[1]: Finished Create List of Static Device Nodes.
Nov 24 13:42:16 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 13:42:16 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 24 13:42:16 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 24 13:42:16 localhost systemd[1]: Finished Load Kernel Module drm.
Nov 24 13:42:16 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 24 13:42:16 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 24 13:42:16 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 24 13:42:16 localhost systemd[1]: Finished Load Kernel Module fuse.
Nov 24 13:42:16 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 24 13:42:16 localhost systemd[1]: Finished Generate network units from Kernel command line.
Nov 24 13:42:16 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 24 13:42:16 localhost systemd[1]: Finished Apply Kernel Variables.
Nov 24 13:42:16 localhost systemd[1]: Mounting FUSE Control File System...
Nov 24 13:42:16 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 24 13:42:16 localhost systemd[1]: Starting Rebuild Hardware Database...
Nov 24 13:42:16 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 24 13:42:16 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 24 13:42:16 localhost systemd[1]: Starting Load/Save OS Random Seed...
Nov 24 13:42:16 localhost systemd[1]: Starting Create System Users...
Nov 24 13:42:16 localhost systemd[1]: Mounted FUSE Control File System.
Nov 24 13:42:16 localhost systemd-journald[679]: Runtime Journal (/run/log/journal/fee38d0f94bf6f4b17ec77ba536bd6ab) is 8.0M, max 153.6M, 145.6M free.
Nov 24 13:42:16 localhost systemd-journald[679]: Received client request to flush runtime journal.
Nov 24 13:42:16 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 24 13:42:16 localhost systemd[1]: Finished Load/Save OS Random Seed.
Nov 24 13:42:16 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 24 13:42:16 localhost systemd[1]: Finished Coldplug All udev Devices.
Nov 24 13:42:16 localhost systemd[1]: Finished Create System Users.
Nov 24 13:42:16 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 24 13:42:17 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 24 13:42:17 localhost systemd[1]: Reached target Preparation for Local File Systems.
Nov 24 13:42:17 localhost systemd[1]: Reached target Local File Systems.
Nov 24 13:42:17 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 24 13:42:17 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 24 13:42:17 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 24 13:42:17 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 24 13:42:17 localhost systemd[1]: Starting Automatic Boot Loader Update...
Nov 24 13:42:17 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 24 13:42:17 localhost systemd[1]: Starting Create Volatile Files and Directories...
Nov 24 13:42:17 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Nov 24 13:42:17 localhost systemd[1]: Finished Automatic Boot Loader Update.
Nov 24 13:42:17 localhost systemd[1]: Finished Create Volatile Files and Directories.
Nov 24 13:42:17 localhost systemd[1]: Starting Security Auditing Service...
Nov 24 13:42:17 localhost systemd[1]: Starting RPC Bind...
Nov 24 13:42:17 localhost systemd[1]: Starting Rebuild Journal Catalog...
Nov 24 13:42:17 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 24 13:42:17 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 24 13:42:17 localhost systemd[1]: Finished Rebuild Journal Catalog.
Nov 24 13:42:17 localhost augenrules[708]: /sbin/augenrules: No change
Nov 24 13:42:17 localhost systemd[1]: Started RPC Bind.
Nov 24 13:42:17 localhost augenrules[723]: No rules
Nov 24 13:42:17 localhost augenrules[723]: enabled 1
Nov 24 13:42:17 localhost augenrules[723]: failure 1
Nov 24 13:42:17 localhost augenrules[723]: pid 703
Nov 24 13:42:17 localhost augenrules[723]: rate_limit 0
Nov 24 13:42:17 localhost augenrules[723]: backlog_limit 8192
Nov 24 13:42:17 localhost augenrules[723]: lost 0
Nov 24 13:42:17 localhost augenrules[723]: backlog 2
Nov 24 13:42:17 localhost augenrules[723]: backlog_wait_time 60000
Nov 24 13:42:17 localhost augenrules[723]: backlog_wait_time_actual 0
Nov 24 13:42:17 localhost augenrules[723]: enabled 1
Nov 24 13:42:17 localhost augenrules[723]: failure 1
Nov 24 13:42:17 localhost augenrules[723]: pid 703
Nov 24 13:42:17 localhost augenrules[723]: rate_limit 0
Nov 24 13:42:17 localhost augenrules[723]: backlog_limit 8192
Nov 24 13:42:17 localhost augenrules[723]: lost 0
Nov 24 13:42:17 localhost augenrules[723]: backlog 0
Nov 24 13:42:17 localhost augenrules[723]: backlog_wait_time 60000
Nov 24 13:42:17 localhost augenrules[723]: backlog_wait_time_actual 0
Nov 24 13:42:17 localhost systemd[1]: Started Security Auditing Service.
Nov 24 13:42:17 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 24 13:42:17 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 24 13:42:17 localhost systemd[1]: Finished Rebuild Hardware Database.
Nov 24 13:42:17 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 24 13:42:17 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 24 13:42:17 localhost systemd[1]: Starting Update is Completed...
Nov 24 13:42:17 localhost systemd[1]: Finished Update is Completed.
Nov 24 13:42:17 localhost systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Nov 24 13:42:17 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 24 13:42:17 localhost systemd[1]: Reached target System Initialization.
Nov 24 13:42:17 localhost systemd[1]: Started dnf makecache --timer.
Nov 24 13:42:17 localhost systemd[1]: Started Daily rotation of log files.
Nov 24 13:42:17 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 24 13:42:17 localhost systemd[1]: Reached target Timer Units.
Nov 24 13:42:17 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 24 13:42:17 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 24 13:42:17 localhost systemd[1]: Reached target Socket Units.
Nov 24 13:42:17 localhost systemd[1]: Starting D-Bus System Message Bus...
Nov 24 13:42:17 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 24 13:42:17 localhost systemd[1]: Starting Load Kernel Module configfs...
Nov 24 13:42:17 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 24 13:42:17 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 24 13:42:17 localhost systemd[1]: Finished Load Kernel Module configfs.
Nov 24 13:42:17 localhost systemd-udevd[748]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:42:17 localhost systemd[1]: Started D-Bus System Message Bus.
Nov 24 13:42:17 localhost systemd[1]: Reached target Basic System.
Nov 24 13:42:17 localhost dbus-broker-lau[759]: Ready
Nov 24 13:42:17 localhost systemd[1]: Starting NTP client/server...
Nov 24 13:42:17 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 24 13:42:17 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 24 13:42:17 localhost systemd[1]: Starting IPv4 firewall with iptables...
Nov 24 13:42:17 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 24 13:42:17 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 24 13:42:17 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 24 13:42:17 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 24 13:42:17 localhost chronyd[792]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 24 13:42:17 localhost chronyd[792]: Loaded 0 symmetric keys
Nov 24 13:42:17 localhost chronyd[792]: Using right/UTC timezone to obtain leap second data
Nov 24 13:42:17 localhost chronyd[792]: Loaded seccomp filter (level 2)
Nov 24 13:42:17 localhost systemd[1]: Started irqbalance daemon.
Nov 24 13:42:17 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 24 13:42:17 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 24 13:42:17 localhost kernel: kvm_amd: TSC scaling supported
Nov 24 13:42:17 localhost kernel: kvm_amd: Nested Virtualization enabled
Nov 24 13:42:17 localhost kernel: kvm_amd: Nested Paging enabled
Nov 24 13:42:17 localhost kernel: kvm_amd: LBR virtualization supported
Nov 24 13:42:17 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 24 13:42:17 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 24 13:42:17 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 24 13:42:17 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 13:42:17 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 13:42:17 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 13:42:17 localhost systemd[1]: Reached target sshd-keygen.target.
Nov 24 13:42:17 localhost kernel: Console: switching to colour dummy device 80x25
Nov 24 13:42:17 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 24 13:42:17 localhost kernel: [drm] features: -context_init
Nov 24 13:42:17 localhost kernel: [drm] number of scanouts: 1
Nov 24 13:42:17 localhost kernel: [drm] number of cap sets: 0
Nov 24 13:42:17 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 24 13:42:17 localhost systemd[1]: Reached target User and Group Name Lookups.
Nov 24 13:42:17 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 24 13:42:17 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 24 13:42:17 localhost kernel: Console: switching to colour frame buffer device 128x48
Nov 24 13:42:17 localhost systemd[1]: Starting User Login Management...
Nov 24 13:42:18 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 24 13:42:18 localhost systemd[1]: Started NTP client/server.
Nov 24 13:42:18 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 24 13:42:18 localhost systemd-logind[807]: New seat seat0.
Nov 24 13:42:18 localhost systemd-logind[807]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 24 13:42:18 localhost systemd-logind[807]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 24 13:42:18 localhost systemd[1]: Started User Login Management.
Nov 24 13:42:18 localhost iptables.init[778]: iptables: Applying firewall rules: [  OK  ]
Nov 24 13:42:18 localhost systemd[1]: Finished IPv4 firewall with iptables.
Nov 24 13:42:18 localhost cloud-init[840]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 24 Nov 2025 13:42:18 +0000. Up 7.16 seconds.
Nov 24 13:42:18 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Nov 24 13:42:18 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Nov 24 13:42:18 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpf1yj6uo8.mount: Deactivated successfully.
Nov 24 13:42:18 localhost systemd[1]: Starting Hostname Service...
Nov 24 13:42:18 localhost systemd[1]: Started Hostname Service.
Nov 24 13:42:18 np0005533658.novalocal systemd-hostnamed[854]: Hostname set to <np0005533658.novalocal> (static)
Nov 24 13:42:18 np0005533658.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 24 13:42:18 np0005533658.novalocal systemd[1]: Reached target Preparation for Network.
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Starting Network Manager...
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.0655] NetworkManager (version 1.54.1-1.el9) is starting... (boot:07f12fa4-da71-4102-93d7-808aadc1fc71)
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.0662] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.0825] manager[0x56326b459080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.0881] hostname: hostname: using hostnamed
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.0881] hostname: static hostname changed from (none) to "np0005533658.novalocal"
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.0885] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.0992] manager[0x56326b459080]: rfkill: Wi-Fi hardware radio set enabled
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.0995] manager[0x56326b459080]: rfkill: WWAN hardware radio set enabled
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1104] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1105] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1106] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1107] manager: Networking is enabled by state file
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1110] settings: Loaded settings plugin: keyfile (internal)
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1140] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1178] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1211] dhcp: init: Using DHCP client 'internal'
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1215] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1239] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1259] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1271] device (lo): Activation: starting connection 'lo' (e622f3c8-2558-4f9f-886b-8e216d717dfd)
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1285] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1290] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Started Network Manager.
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1351] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1360] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1365] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1369] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1372] device (eth0): carrier: link connected
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1379] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1389] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Reached target Network.
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1401] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1408] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1410] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1414] manager: NetworkManager state is now CONNECTING
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1417] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1426] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1431] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1476] dhcp4 (eth0): state changed new lease, address=38.102.83.214
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1484] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1510] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1558] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1560] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1561] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1569] device (lo): Activation: successful, device activated.
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1576] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1582] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1585] device (eth0): Activation: successful, device activated.
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1592] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 24 13:42:19 np0005533658.novalocal NetworkManager[858]: <info>  [1763991739.1596] manager: startup complete
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Reached target NFS client services.
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: Reached target Remote File Systems.
Nov 24 13:42:19 np0005533658.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 24 Nov 2025 13:42:19 +0000. Up 8.27 seconds.
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: |  eth0  | True |        38.102.83.214         | 255.255.255.0 | global | fa:16:3e:7e:a2:2e |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:fe7e:a22e/64 |       .       |  link  | fa:16:3e:7e:a2:2e |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 24 13:42:19 np0005533658.novalocal cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 24 13:42:20 np0005533658.novalocal useradd[988]: new group: name=cloud-user, GID=1001
Nov 24 13:42:20 np0005533658.novalocal useradd[988]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Nov 24 13:42:20 np0005533658.novalocal useradd[988]: add 'cloud-user' to group 'adm'
Nov 24 13:42:20 np0005533658.novalocal useradd[988]: add 'cloud-user' to group 'systemd-journal'
Nov 24 13:42:20 np0005533658.novalocal useradd[988]: add 'cloud-user' to shadow group 'adm'
Nov 24 13:42:20 np0005533658.novalocal useradd[988]: add 'cloud-user' to shadow group 'systemd-journal'
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: Generating public/private rsa key pair.
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: The key fingerprint is:
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: SHA256:OTNOGWxgbrttyqAy5gRx5R1fudEMEGoSRqcjsmosnvc root@np0005533658.novalocal
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: The key's randomart image is:
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: +---[RSA 3072]----+
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |   .= = ooo=     |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |   + B * .o o    |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |o o = * =  o     |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: | = . = o +.      |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |o     . S        |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |+      = +       |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |o+  . . +        |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |B..o o o         |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |o*o .Eo          |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: +----[SHA256]-----+
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: Generating public/private ecdsa key pair.
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: The key fingerprint is:
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: SHA256:HTbQtIrYx7A5XwyFvr6cP44Z1odLkO2WGpGcSUlo1LI root@np0005533658.novalocal
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: The key's randomart image is:
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: +---[ECDSA 256]---+
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |     ..oo+o      |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |      +.o+..     |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |     ..++ =      |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |     oEO.% o     |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |    . * S.=      |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |       +.* o     |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |       .= B .    |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |       o.X.o     |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |        B++.     |
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: +----[SHA256]-----+
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: Generating public/private ed25519 key pair.
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: The key fingerprint is:
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: SHA256:1Zdn0/spwt0SHV8tflaVKZ/m3eCH3HNtZ6zy3a33ngE root@np0005533658.novalocal
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: The key's randomart image is:
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: +--[ED25519 256]--+
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |                +|
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |           . . ++|
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |          . . *oO|
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |         .   ooBB|
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |        S    E+BB|
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |          . . B=#|
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |           o + O=|
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |            o +.*|
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: |             oo=B|
Nov 24 13:42:20 np0005533658.novalocal cloud-init[921]: +----[SHA256]-----+
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Reached target Cloud-config availability.
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Reached target Network is Online.
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Starting Crash recovery kernel arming...
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Starting System Logging Service...
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Starting OpenSSH server daemon...
Nov 24 13:42:20 np0005533658.novalocal sm-notify[1004]: Version 2.5.4 starting
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Starting Permit User Sessions...
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Started Notify NFS peers of a restart.
Nov 24 13:42:20 np0005533658.novalocal sshd[1006]: Server listening on 0.0.0.0 port 22.
Nov 24 13:42:20 np0005533658.novalocal sshd[1006]: Server listening on :: port 22.
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Started OpenSSH server daemon.
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Finished Permit User Sessions.
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Started Command Scheduler.
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Started Getty on tty1.
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Started Serial Getty on ttyS0.
Nov 24 13:42:20 np0005533658.novalocal crond[1010]: (CRON) STARTUP (1.5.7)
Nov 24 13:42:20 np0005533658.novalocal crond[1010]: (CRON) INFO (Syslog will be used instead of sendmail.)
Nov 24 13:42:20 np0005533658.novalocal systemd[1]: Reached target Login Prompts.
Nov 24 13:42:20 np0005533658.novalocal crond[1010]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 68% if used.)
Nov 24 13:42:20 np0005533658.novalocal crond[1010]: (CRON) INFO (running with inotify support)
Nov 24 13:42:21 np0005533658.novalocal rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Nov 24 13:42:21 np0005533658.novalocal systemd[1]: Started System Logging Service.
Nov 24 13:42:21 np0005533658.novalocal rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 24 13:42:21 np0005533658.novalocal systemd[1]: Reached target Multi-User System.
Nov 24 13:42:21 np0005533658.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 24 13:42:21 np0005533658.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 24 13:42:21 np0005533658.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 24 13:42:21 np0005533658.novalocal rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 13:42:21 np0005533658.novalocal kdumpctl[1015]: kdump: No kdump initial ramdisk found.
Nov 24 13:42:21 np0005533658.novalocal kdumpctl[1015]: kdump: Rebuilding /boot/initramfs-5.14.0-639.el9.x86_64kdump.img
Nov 24 13:42:21 np0005533658.novalocal sshd-session[1064]: Connection reset by 38.102.83.114 port 33492 [preauth]
Nov 24 13:42:21 np0005533658.novalocal sshd-session[1072]: Unable to negotiate with 38.102.83.114 port 33502: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Nov 24 13:42:21 np0005533658.novalocal sshd-session[1081]: Connection closed by 38.102.83.114 port 33506 [preauth]
Nov 24 13:42:21 np0005533658.novalocal sshd-session[1092]: Unable to negotiate with 38.102.83.114 port 33508: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Nov 24 13:42:21 np0005533658.novalocal sshd-session[1104]: Unable to negotiate with 38.102.83.114 port 33520: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Nov 24 13:42:21 np0005533658.novalocal sshd-session[1111]: Connection closed by 38.102.83.114 port 33532 [preauth]
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1122]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 24 Nov 2025 13:42:21 +0000. Up 9.94 seconds.
Nov 24 13:42:21 np0005533658.novalocal sshd-session[1143]: Unable to negotiate with 38.102.83.114 port 33548: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Nov 24 13:42:21 np0005533658.novalocal sshd-session[1150]: Unable to negotiate with 38.102.83.114 port 33554: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Nov 24 13:42:21 np0005533658.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Nov 24 13:42:21 np0005533658.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Nov 24 13:42:21 np0005533658.novalocal sshd-session[1130]: Connection closed by 38.102.83.114 port 33534 [preauth]
Nov 24 13:42:21 np0005533658.novalocal dracut[1286]: dracut-057-102.git20250818.el9
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1302]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 24 Nov 2025 13:42:21 +0000. Up 10.34 seconds.
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1304]: #############################################################
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1305]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1307]: 256 SHA256:HTbQtIrYx7A5XwyFvr6cP44Z1odLkO2WGpGcSUlo1LI root@np0005533658.novalocal (ECDSA)
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1311]: 256 SHA256:1Zdn0/spwt0SHV8tflaVKZ/m3eCH3HNtZ6zy3a33ngE root@np0005533658.novalocal (ED25519)
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1316]: 3072 SHA256:OTNOGWxgbrttyqAy5gRx5R1fudEMEGoSRqcjsmosnvc root@np0005533658.novalocal (RSA)
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1319]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1322]: #############################################################
Nov 24 13:42:21 np0005533658.novalocal dracut[1288]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/47e3724e-7a1b-439a-9543-b98c9a290709 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-639.el9.x86_64kdump.img 5.14.0-639.el9.x86_64
Nov 24 13:42:21 np0005533658.novalocal cloud-init[1302]: Cloud-init v. 24.4-7.el9 finished at Mon, 24 Nov 2025 13:42:21 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.52 seconds
Nov 24 13:42:21 np0005533658.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Nov 24 13:42:21 np0005533658.novalocal systemd[1]: Reached target Cloud-init target.
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: Module 'resume' will not be installed, because it's in the list to be omitted!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 24 13:42:22 np0005533658.novalocal dracut[1288]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: memstrack is not available
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: memstrack is not available
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: *** Including module: systemd ***
Nov 24 13:42:23 np0005533658.novalocal dracut[1288]: *** Including module: fips ***
Nov 24 13:42:24 np0005533658.novalocal dracut[1288]: *** Including module: systemd-initrd ***
Nov 24 13:42:24 np0005533658.novalocal chronyd[792]: Selected source 167.160.187.179 (2.centos.pool.ntp.org)
Nov 24 13:42:24 np0005533658.novalocal chronyd[792]: System clock wrong by 1.260958 seconds
Nov 24 13:42:25 np0005533658.novalocal chronyd[792]: System clock was stepped by 1.260958 seconds
Nov 24 13:42:25 np0005533658.novalocal chronyd[792]: System clock TAI offset set to 37 seconds
Nov 24 13:42:25 np0005533658.novalocal dracut[1288]: *** Including module: i18n ***
Nov 24 13:42:25 np0005533658.novalocal dracut[1288]: *** Including module: drm ***
Nov 24 13:42:25 np0005533658.novalocal dracut[1288]: *** Including module: prefixdevname ***
Nov 24 13:42:25 np0005533658.novalocal dracut[1288]: *** Including module: kernel-modules ***
Nov 24 13:42:26 np0005533658.novalocal kernel: block vda: the capability attribute has been deprecated.
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]: *** Including module: kernel-modules-extra ***
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]: *** Including module: qemu ***
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]: *** Including module: fstab-sys ***
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]: *** Including module: rootfs-block ***
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]: *** Including module: terminfo ***
Nov 24 13:42:26 np0005533658.novalocal dracut[1288]: *** Including module: udev-rules ***
Nov 24 13:42:27 np0005533658.novalocal dracut[1288]: Skipping udev rule: 91-permissions.rules
Nov 24 13:42:27 np0005533658.novalocal dracut[1288]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 24 13:42:27 np0005533658.novalocal dracut[1288]: *** Including module: virtiofs ***
Nov 24 13:42:27 np0005533658.novalocal dracut[1288]: *** Including module: dracut-systemd ***
Nov 24 13:42:27 np0005533658.novalocal dracut[1288]: *** Including module: usrmount ***
Nov 24 13:42:27 np0005533658.novalocal dracut[1288]: *** Including module: base ***
Nov 24 13:42:27 np0005533658.novalocal dracut[1288]: *** Including module: fs-lib ***
Nov 24 13:42:27 np0005533658.novalocal dracut[1288]: *** Including module: kdumpbase ***
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:   microcode_ctl module: mangling fw_dir
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]: *** Including module: openssl ***
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]: *** Including module: shutdown ***
Nov 24 13:42:28 np0005533658.novalocal dracut[1288]: *** Including module: squash ***
Nov 24 13:42:29 np0005533658.novalocal dracut[1288]: *** Including modules done ***
Nov 24 13:42:29 np0005533658.novalocal dracut[1288]: *** Installing kernel module dependencies ***
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: IRQ 25 affinity is now unmanaged
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: IRQ 31 affinity is now unmanaged
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: IRQ 28 affinity is now unmanaged
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: IRQ 32 affinity is now unmanaged
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: IRQ 30 affinity is now unmanaged
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 24 13:42:29 np0005533658.novalocal irqbalance[784]: IRQ 29 affinity is now unmanaged
Nov 24 13:42:29 np0005533658.novalocal dracut[1288]: *** Installing kernel module dependencies done ***
Nov 24 13:42:29 np0005533658.novalocal dracut[1288]: *** Resolving executable dependencies ***
Nov 24 13:42:30 np0005533658.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 13:42:31 np0005533658.novalocal dracut[1288]: *** Resolving executable dependencies done ***
Nov 24 13:42:31 np0005533658.novalocal dracut[1288]: *** Generating early-microcode cpio image ***
Nov 24 13:42:31 np0005533658.novalocal dracut[1288]: *** Store current command line parameters ***
Nov 24 13:42:31 np0005533658.novalocal dracut[1288]: Stored kernel commandline:
Nov 24 13:42:31 np0005533658.novalocal dracut[1288]: No dracut internal kernel commandline stored in the initramfs
Nov 24 13:42:31 np0005533658.novalocal dracut[1288]: *** Install squash loader ***
Nov 24 13:42:32 np0005533658.novalocal dracut[1288]: *** Squashing the files inside the initramfs ***
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: *** Squashing the files inside the initramfs done ***
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: *** Creating image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' ***
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: *** Hardlinking files ***
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: Mode:           real
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: Files:          50
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: Linked:         0 files
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: Compared:       0 xattrs
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: Compared:       0 files
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: Saved:          0 B
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: Duration:       0.000470 seconds
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: *** Hardlinking files done ***
Nov 24 13:42:34 np0005533658.novalocal dracut[1288]: *** Creating initramfs image file '/boot/initramfs-5.14.0-639.el9.x86_64kdump.img' done ***
Nov 24 13:42:34 np0005533658.novalocal kdumpctl[1015]: kdump: kexec: loaded kdump kernel
Nov 24 13:42:34 np0005533658.novalocal kdumpctl[1015]: kdump: Starting kdump: [OK]
Nov 24 13:42:35 np0005533658.novalocal systemd[1]: Finished Crash recovery kernel arming.
Nov 24 13:42:35 np0005533658.novalocal systemd[1]: Startup finished in 1.780s (kernel) + 2.918s (initrd) + 17.792s (userspace) = 22.491s.
Nov 24 13:42:39 np0005533658.novalocal sshd-session[4296]: Accepted publickey for zuul from 38.102.83.114 port 48004 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Nov 24 13:42:40 np0005533658.novalocal systemd[1]: Created slice User Slice of UID 1000.
Nov 24 13:42:40 np0005533658.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 24 13:42:40 np0005533658.novalocal systemd-logind[807]: New session 1 of user zuul.
Nov 24 13:42:40 np0005533658.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 24 13:42:40 np0005533658.novalocal systemd[1]: Starting User Manager for UID 1000...
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Queued start job for default target Main User Target.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Created slice User Application Slice.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Reached target Paths.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Reached target Timers.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Starting D-Bus User Message Bus Socket...
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Starting Create User's Volatile Files and Directories...
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Finished Create User's Volatile Files and Directories.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Listening on D-Bus User Message Bus Socket.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Reached target Sockets.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Reached target Basic System.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Reached target Main User Target.
Nov 24 13:42:40 np0005533658.novalocal systemd[4300]: Startup finished in 162ms.
Nov 24 13:42:40 np0005533658.novalocal systemd[1]: Started User Manager for UID 1000.
Nov 24 13:42:40 np0005533658.novalocal systemd[1]: Started Session 1 of User zuul.
Nov 24 13:42:40 np0005533658.novalocal sshd-session[4296]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:42:40 np0005533658.novalocal python3[4382]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:42:43 np0005533658.novalocal python3[4410]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:42:49 np0005533658.novalocal python3[4468]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:42:50 np0005533658.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 13:42:50 np0005533658.novalocal python3[4510]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 24 13:42:52 np0005533658.novalocal python3[4536]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1qNLkG99zuoUf8gxLSU/NkcfF3BTXFjrNzjTJXsSI+AzhWMcbbr6xGSe7eXOMd/h56fWaCZAHTnHypTI63/vuD6QNFxnGCsCZs3GsYLky8ONHnAYt08zlrMYZ0lI9k7y4xeq+/MZ1fMFBJ3PIY8O4Q5LNZdAT8b1UtVIGEkFbJI9hoPFEGh6mm52pZ6Catu/vYSvaFGRANODwKFl8Vt9XnIdc156lpOVNuWKKO+NCdtwZb2BIo1k+Un6mbTBN5I8Awl0z0r0INs1+pHFBOmblotIHsxB5udOokjHlVNoOyLgHU6hcNVsNDL6KtY5pBXqbsC7IsWd/MZaEuoznr7ZXrLKLhc8PZ/SoYbiFsw0PZqpBCgN37sJRyGFJ0lscjClNE5Vg5s+wRAPze4cgzrzJ4pQLcume4WeC7zy4g9ke6KoS5nk4l9WJPO812nECFbF7+/q8aejVTBhGTQdBlkKxYCc1dxNZvIwJzlRGj6629HALRG+D6qArs1giL80xzdM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:42:52 np0005533658.novalocal python3[4560]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:42:53 np0005533658.novalocal python3[4659]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:42:53 np0005533658.novalocal python3[4730]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763991772.9474623-207-43145522743803/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=a165c2ca6e3744b5ad9bbe60b922a225_id_rsa follow=False checksum=addab664920ff7fd2ef375607d43c8d33329dc96 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:42:54 np0005533658.novalocal python3[4853]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:42:54 np0005533658.novalocal python3[4924]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763991773.9316127-240-48418135930083/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=a165c2ca6e3744b5ad9bbe60b922a225_id_rsa.pub follow=False checksum=430a89fd0ad3f929e12266a427fcc32cd5398f2b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:42:56 np0005533658.novalocal python3[4972]: ansible-ping Invoked with data=pong
Nov 24 13:42:57 np0005533658.novalocal python3[4996]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:42:59 np0005533658.novalocal python3[5054]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 24 13:43:00 np0005533658.novalocal python3[5086]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:00 np0005533658.novalocal python3[5110]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:01 np0005533658.novalocal python3[5134]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:01 np0005533658.novalocal python3[5158]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:01 np0005533658.novalocal python3[5182]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:01 np0005533658.novalocal python3[5206]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:03 np0005533658.novalocal sudo[5230]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erhdeiwbezcqqobuygqejdlmjpkpwifp ; /usr/bin/python3'
Nov 24 13:43:03 np0005533658.novalocal sudo[5230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:03 np0005533658.novalocal python3[5232]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:03 np0005533658.novalocal sudo[5230]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:03 np0005533658.novalocal sudo[5308]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koaisueuwchkhmdtlqdsxhkushxoscuf ; /usr/bin/python3'
Nov 24 13:43:03 np0005533658.novalocal sudo[5308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:04 np0005533658.novalocal python3[5310]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:43:04 np0005533658.novalocal sudo[5308]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:04 np0005533658.novalocal sudo[5381]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjcolcxgdxtpmysjdzaapagknopvrxy ; /usr/bin/python3'
Nov 24 13:43:04 np0005533658.novalocal sudo[5381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:04 np0005533658.novalocal python3[5383]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763991783.6423295-21-95241083052733/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:04 np0005533658.novalocal sudo[5381]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:05 np0005533658.novalocal python3[5431]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:05 np0005533658.novalocal python3[5455]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:05 np0005533658.novalocal python3[5479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:06 np0005533658.novalocal python3[5503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:06 np0005533658.novalocal python3[5527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:06 np0005533658.novalocal python3[5551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:07 np0005533658.novalocal python3[5575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:07 np0005533658.novalocal python3[5599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:07 np0005533658.novalocal python3[5623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:07 np0005533658.novalocal python3[5647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:08 np0005533658.novalocal python3[5671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:08 np0005533658.novalocal python3[5695]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:08 np0005533658.novalocal python3[5719]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:09 np0005533658.novalocal python3[5743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:09 np0005533658.novalocal python3[5767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:09 np0005533658.novalocal python3[5791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:10 np0005533658.novalocal python3[5815]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:10 np0005533658.novalocal python3[5839]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:10 np0005533658.novalocal python3[5863]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:11 np0005533658.novalocal python3[5887]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:11 np0005533658.novalocal python3[5911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:11 np0005533658.novalocal python3[5935]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:11 np0005533658.novalocal python3[5959]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:12 np0005533658.novalocal python3[5983]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:12 np0005533658.novalocal python3[6007]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:12 np0005533658.novalocal python3[6031]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:43:15 np0005533658.novalocal sudo[6055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qczpknkpdjnavrzbrodjuxnwlqqvebgh ; /usr/bin/python3'
Nov 24 13:43:15 np0005533658.novalocal sudo[6055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:15 np0005533658.novalocal python3[6057]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 24 13:43:15 np0005533658.novalocal systemd[1]: Starting Time & Date Service...
Nov 24 13:43:15 np0005533658.novalocal systemd[1]: Started Time & Date Service.
Nov 24 13:43:15 np0005533658.novalocal systemd-timedated[6059]: Changed time zone to 'UTC' (UTC).
Nov 24 13:43:15 np0005533658.novalocal sudo[6055]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:15 np0005533658.novalocal sudo[6086]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taiwoqovjaeahflllmgapafjbsmnhpmj ; /usr/bin/python3'
Nov 24 13:43:15 np0005533658.novalocal sudo[6086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:15 np0005533658.novalocal python3[6088]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:15 np0005533658.novalocal sudo[6086]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:16 np0005533658.novalocal python3[6164]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:43:16 np0005533658.novalocal python3[6235]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1763991796.0537217-153-225686046062515/source _original_basename=tmp_i28n909 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:17 np0005533658.novalocal python3[6335]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:43:17 np0005533658.novalocal python3[6406]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763991797.015404-183-88630700538120/source _original_basename=tmp7kwkp617 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:18 np0005533658.novalocal sudo[6506]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezybygjrzaxgsuzldlgttmqvqzhjnqjh ; /usr/bin/python3'
Nov 24 13:43:18 np0005533658.novalocal sudo[6506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:18 np0005533658.novalocal python3[6508]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:43:18 np0005533658.novalocal sudo[6506]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:19 np0005533658.novalocal sudo[6579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpwimkgzosvnwggujnorlnhfwcoxbfhk ; /usr/bin/python3'
Nov 24 13:43:19 np0005533658.novalocal sudo[6579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:19 np0005533658.novalocal python3[6581]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1763991798.55388-231-59521996211607/source _original_basename=tmp93gt9ufh follow=False checksum=6ccb10e811009b8a3fb6665900f19085a6ead209 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:19 np0005533658.novalocal sudo[6579]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:19 np0005533658.novalocal python3[6629]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:43:20 np0005533658.novalocal python3[6655]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:43:20 np0005533658.novalocal sudo[6733]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyknaosnwykmrykgmnqdwtdbxltnhacb ; /usr/bin/python3'
Nov 24 13:43:20 np0005533658.novalocal sudo[6733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:20 np0005533658.novalocal python3[6735]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:43:20 np0005533658.novalocal sudo[6733]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:20 np0005533658.novalocal sudo[6806]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brmwkbiwmlohfjsksztklcfxsllxpkln ; /usr/bin/python3'
Nov 24 13:43:20 np0005533658.novalocal sudo[6806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:21 np0005533658.novalocal python3[6808]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1763991800.3506153-273-163633331935165/source _original_basename=tmpusipbzmx follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:21 np0005533658.novalocal sudo[6806]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:21 np0005533658.novalocal sudo[6857]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aelhwhvpucogtntqenhkimvpuoetjrcw ; /usr/bin/python3'
Nov 24 13:43:21 np0005533658.novalocal sudo[6857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:21 np0005533658.novalocal python3[6859]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-d5a8-38d0-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:43:21 np0005533658.novalocal sudo[6857]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:22 np0005533658.novalocal python3[6887]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-d5a8-38d0-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 24 13:43:23 np0005533658.novalocal python3[6915]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:39 np0005533658.novalocal sudo[6939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utpynvzaukwjzkkggsqjtzirgnorjhby ; /usr/bin/python3'
Nov 24 13:43:39 np0005533658.novalocal sudo[6939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:43:40 np0005533658.novalocal python3[6941]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:43:40 np0005533658.novalocal sudo[6939]: pam_unix(sudo:session): session closed for user root
Nov 24 13:43:45 np0005533658.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 24 13:44:14 np0005533658.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 24 13:44:14 np0005533658.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 24 13:44:14 np0005533658.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 24 13:44:14 np0005533658.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 24 13:44:14 np0005533658.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 24 13:44:14 np0005533658.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 24 13:44:14 np0005533658.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 24 13:44:14 np0005533658.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 24 13:44:14 np0005533658.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 24 13:44:14 np0005533658.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5734] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 24 13:44:14 np0005533658.novalocal systemd-udevd[6945]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5953] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5975] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5979] device (eth1): carrier: link connected
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5981] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5987] policy: auto-activating connection 'Wired connection 1' (c0f7c555-02e9-39b7-af48-41e7866f30b4)
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5990] device (eth1): Activation: starting connection 'Wired connection 1' (c0f7c555-02e9-39b7-af48-41e7866f30b4)
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5991] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5994] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.5997] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 13:44:14 np0005533658.novalocal NetworkManager[858]: <info>  [1763991854.6001] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 24 13:44:15 np0005533658.novalocal python3[6971]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-e5d3-2605-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:44:22 np0005533658.novalocal sudo[7049]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-folijemhifivmdmmdnpjbcquekaejmeh ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 13:44:22 np0005533658.novalocal sudo[7049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:44:22 np0005533658.novalocal python3[7051]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:44:22 np0005533658.novalocal sudo[7049]: pam_unix(sudo:session): session closed for user root
Nov 24 13:44:22 np0005533658.novalocal sudo[7122]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bneigdcwguqzumagszyzbpnqmwgunrlg ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 13:44:22 np0005533658.novalocal sudo[7122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:44:23 np0005533658.novalocal python3[7124]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763991862.3943498-102-229402039162288/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=ea79d5cbcc435c73025d3468a4e962b3c78985d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:44:23 np0005533658.novalocal sudo[7122]: pam_unix(sudo:session): session closed for user root
Nov 24 13:44:23 np0005533658.novalocal sudo[7172]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbuhbizczhvviwvuhceqclbstzobnben ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 13:44:23 np0005533658.novalocal sudo[7172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:44:23 np0005533658.novalocal python3[7174]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 13:44:23 np0005533658.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 24 13:44:23 np0005533658.novalocal systemd[1]: Stopped Network Manager Wait Online.
Nov 24 13:44:23 np0005533658.novalocal systemd[1]: Stopping Network Manager Wait Online...
Nov 24 13:44:23 np0005533658.novalocal NetworkManager[858]: <info>  [1763991863.9408] caught SIGTERM, shutting down normally.
Nov 24 13:44:23 np0005533658.novalocal systemd[1]: Stopping Network Manager...
Nov 24 13:44:23 np0005533658.novalocal NetworkManager[858]: <info>  [1763991863.9424] dhcp4 (eth0): canceled DHCP transaction
Nov 24 13:44:23 np0005533658.novalocal NetworkManager[858]: <info>  [1763991863.9424] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 13:44:23 np0005533658.novalocal NetworkManager[858]: <info>  [1763991863.9424] dhcp4 (eth0): state changed no lease
Nov 24 13:44:23 np0005533658.novalocal NetworkManager[858]: <info>  [1763991863.9427] manager: NetworkManager state is now CONNECTING
Nov 24 13:44:23 np0005533658.novalocal NetworkManager[858]: <info>  [1763991863.9638] dhcp4 (eth1): canceled DHCP transaction
Nov 24 13:44:23 np0005533658.novalocal NetworkManager[858]: <info>  [1763991863.9639] dhcp4 (eth1): state changed no lease
Nov 24 13:44:23 np0005533658.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 13:44:23 np0005533658.novalocal NetworkManager[858]: <info>  [1763991863.9694] exiting (success)
Nov 24 13:44:23 np0005533658.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 13:44:23 np0005533658.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 24 13:44:23 np0005533658.novalocal systemd[1]: Stopped Network Manager.
Nov 24 13:44:24 np0005533658.novalocal systemd[1]: Starting Network Manager...
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.0383] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:07f12fa4-da71-4102-93d7-808aadc1fc71)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.0385] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.0445] manager[0x55acb72ce070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 24 13:44:24 np0005533658.novalocal systemd[1]: Starting Hostname Service...
Nov 24 13:44:24 np0005533658.novalocal systemd[1]: Started Hostname Service.
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1500] hostname: hostname: using hostnamed
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1501] hostname: static hostname changed from (none) to "np0005533658.novalocal"
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1506] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1511] manager[0x55acb72ce070]: rfkill: Wi-Fi hardware radio set enabled
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1512] manager[0x55acb72ce070]: rfkill: WWAN hardware radio set enabled
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1542] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1542] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1543] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1543] manager: Networking is enabled by state file
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1546] settings: Loaded settings plugin: keyfile (internal)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1549] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1574] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1582] dhcp: init: Using DHCP client 'internal'
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1585] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1590] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1594] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1601] device (lo): Activation: starting connection 'lo' (e622f3c8-2558-4f9f-886b-8e216d717dfd)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1607] device (eth0): carrier: link connected
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1611] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1616] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1617] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1622] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1628] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1633] device (eth1): carrier: link connected
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1637] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1642] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (c0f7c555-02e9-39b7-af48-41e7866f30b4) (indicated)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1642] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1647] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1652] device (eth1): Activation: starting connection 'Wired connection 1' (c0f7c555-02e9-39b7-af48-41e7866f30b4)
Nov 24 13:44:24 np0005533658.novalocal systemd[1]: Started Network Manager.
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1658] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1662] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1663] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1665] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1667] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1669] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1671] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1673] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1675] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1681] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1683] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1706] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1709] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1727] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1732] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1738] device (lo): Activation: successful, device activated.
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1745] dhcp4 (eth0): state changed new lease, address=38.102.83.214
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1751] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 24 13:44:24 np0005533658.novalocal systemd[1]: Starting Network Manager Wait Online...
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1825] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1880] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1882] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1885] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1891] device (eth0): Activation: successful, device activated.
Nov 24 13:44:24 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991864.1896] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 24 13:44:24 np0005533658.novalocal sudo[7172]: pam_unix(sudo:session): session closed for user root
Nov 24 13:44:24 np0005533658.novalocal python3[7259]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-e5d3-2605-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:44:34 np0005533658.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 13:44:54 np0005533658.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 13:45:03 np0005533658.novalocal systemd[4300]: Starting Mark boot as successful...
Nov 24 13:45:03 np0005533658.novalocal systemd[4300]: Finished Mark boot as successful.
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5111] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 13:45:09 np0005533658.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 13:45:09 np0005533658.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5391] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5396] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5412] device (eth1): Activation: successful, device activated.
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5424] manager: startup complete
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5427] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <warn>  [1763991909.5438] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5453] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 24 13:45:09 np0005533658.novalocal systemd[1]: Finished Network Manager Wait Online.
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5568] dhcp4 (eth1): canceled DHCP transaction
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5568] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5569] dhcp4 (eth1): state changed no lease
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5586] policy: auto-activating connection 'ci-private-network' (7049c056-21f5-55fc-906e-9890c70fc7c7)
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5591] device (eth1): Activation: starting connection 'ci-private-network' (7049c056-21f5-55fc-906e-9890c70fc7c7)
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5592] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5595] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5604] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5613] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5651] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5655] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 13:45:09 np0005533658.novalocal NetworkManager[7187]: <info>  [1763991909.5665] device (eth1): Activation: successful, device activated.
Nov 24 13:45:19 np0005533658.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 13:45:23 np0005533658.novalocal sshd-session[4309]: Received disconnect from 38.102.83.114 port 48004:11: disconnected by user
Nov 24 13:45:23 np0005533658.novalocal sshd-session[4309]: Disconnected from user zuul 38.102.83.114 port 48004
Nov 24 13:45:23 np0005533658.novalocal sshd-session[4296]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:45:23 np0005533658.novalocal systemd-logind[807]: Session 1 logged out. Waiting for processes to exit.
Nov 24 13:45:24 np0005533658.novalocal sshd-session[7288]: Accepted publickey for zuul from 38.102.83.114 port 52774 ssh2: RSA SHA256:StdIAygMGQUV11+A5F2zWmMmSbeKNeh5/FhmIBCqVH0
Nov 24 13:45:24 np0005533658.novalocal systemd-logind[807]: New session 3 of user zuul.
Nov 24 13:45:24 np0005533658.novalocal systemd[1]: Started Session 3 of User zuul.
Nov 24 13:45:24 np0005533658.novalocal sshd-session[7288]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:45:24 np0005533658.novalocal sudo[7368]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntvmdhpfgfmunxlnpmdypemerjwwmoyb ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 13:45:24 np0005533658.novalocal sudo[7368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:45:24 np0005533658.novalocal python3[7370]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:45:24 np0005533658.novalocal sudo[7368]: pam_unix(sudo:session): session closed for user root
Nov 24 13:45:24 np0005533658.novalocal sudo[7441]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdapwjqtpvzodmrsnqwaxyqbhozvfvjx ; OS_CLOUD=vexxhost /usr/bin/python3'
Nov 24 13:45:24 np0005533658.novalocal sudo[7441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:45:24 np0005533658.novalocal python3[7443]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/ansible-tmp-1763991924.1319485-259-198011357612632/source _original_basename=tmps4mfhuc6 follow=False checksum=87585dae9b64bccef01ffe3ab111b40449e44646 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:45:24 np0005533658.novalocal sudo[7441]: pam_unix(sudo:session): session closed for user root
Nov 24 13:45:27 np0005533658.novalocal sshd-session[7291]: Connection closed by 38.102.83.114 port 52774
Nov 24 13:45:27 np0005533658.novalocal sshd-session[7288]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:45:27 np0005533658.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Nov 24 13:45:27 np0005533658.novalocal systemd-logind[807]: Session 3 logged out. Waiting for processes to exit.
Nov 24 13:45:27 np0005533658.novalocal systemd-logind[807]: Removed session 3.
Nov 24 13:46:15 np0005533658.novalocal sshd-session[7468]: Connection closed by authenticating user root 80.94.95.116 port 47246 [preauth]
Nov 24 13:48:03 np0005533658.novalocal systemd[4300]: Created slice User Background Tasks Slice.
Nov 24 13:48:03 np0005533658.novalocal systemd[4300]: Starting Cleanup of User's Temporary Files and Directories...
Nov 24 13:48:03 np0005533658.novalocal systemd[4300]: Finished Cleanup of User's Temporary Files and Directories.
Nov 24 13:50:21 np0005533658.novalocal sshd-session[7477]: Accepted publickey for zuul from 38.102.83.114 port 42036 ssh2: RSA SHA256:StdIAygMGQUV11+A5F2zWmMmSbeKNeh5/FhmIBCqVH0
Nov 24 13:50:21 np0005533658.novalocal systemd-logind[807]: New session 4 of user zuul.
Nov 24 13:50:21 np0005533658.novalocal systemd[1]: Started Session 4 of User zuul.
Nov 24 13:50:21 np0005533658.novalocal sshd-session[7477]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:50:21 np0005533658.novalocal sudo[7504]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peeckpwbpnjtshhxhymgmhsnprrfyotk ; /usr/bin/python3'
Nov 24 13:50:21 np0005533658.novalocal sudo[7504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:21 np0005533658.novalocal python3[7506]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-2c54-7796-000000001cc4-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:50:21 np0005533658.novalocal sudo[7504]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:21 np0005533658.novalocal sudo[7533]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwghcggxbydrlqeqwpfbesndvsyufjfr ; /usr/bin/python3'
Nov 24 13:50:21 np0005533658.novalocal sudo[7533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:21 np0005533658.novalocal python3[7535]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:50:21 np0005533658.novalocal sudo[7533]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:21 np0005533658.novalocal sudo[7559]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bprrgwmvjpeosyjqivkimtboiafhkzbh ; /usr/bin/python3'
Nov 24 13:50:21 np0005533658.novalocal sudo[7559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:22 np0005533658.novalocal python3[7561]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:50:22 np0005533658.novalocal sudo[7559]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:22 np0005533658.novalocal sudo[7585]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tylphcjqijzwocdrhmoiyyilijfppkzx ; /usr/bin/python3'
Nov 24 13:50:22 np0005533658.novalocal sudo[7585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:22 np0005533658.novalocal python3[7587]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:50:22 np0005533658.novalocal sudo[7585]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:22 np0005533658.novalocal sudo[7611]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmupipnkmsnhfypfsiuytmswsxsiwjny ; /usr/bin/python3'
Nov 24 13:50:22 np0005533658.novalocal sudo[7611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:22 np0005533658.novalocal python3[7613]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:50:22 np0005533658.novalocal sudo[7611]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:23 np0005533658.novalocal sudo[7637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzbxikfcehssarxnpdklfuoymnjzjesq ; /usr/bin/python3'
Nov 24 13:50:23 np0005533658.novalocal sudo[7637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:23 np0005533658.novalocal python3[7639]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:50:23 np0005533658.novalocal sudo[7637]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:23 np0005533658.novalocal sudo[7715]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgtkmtftarznspyiwyxqjmpfeyakgjoq ; /usr/bin/python3'
Nov 24 13:50:23 np0005533658.novalocal sudo[7715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:23 np0005533658.novalocal python3[7717]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:50:23 np0005533658.novalocal sudo[7715]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:24 np0005533658.novalocal sudo[7788]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvbxeshmxbewwfutrebsoiwwzssrlyhg ; /usr/bin/python3'
Nov 24 13:50:24 np0005533658.novalocal sudo[7788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:24 np0005533658.novalocal python3[7790]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763992223.5438395-469-197213763700722/source _original_basename=tmp1um_j48k follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:50:24 np0005533658.novalocal sudo[7788]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:24 np0005533658.novalocal sudo[7838]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwzczxeiwtizidbkrknijtsubodmlphu ; /usr/bin/python3'
Nov 24 13:50:24 np0005533658.novalocal sudo[7838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:25 np0005533658.novalocal python3[7840]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 13:50:25 np0005533658.novalocal systemd[1]: Reloading.
Nov 24 13:50:25 np0005533658.novalocal systemd-rc-local-generator[7859]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:50:25 np0005533658.novalocal sudo[7838]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:26 np0005533658.novalocal sudo[7894]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iaviytiqwnugxqlqrrrfvyawsancpasy ; /usr/bin/python3'
Nov 24 13:50:26 np0005533658.novalocal sudo[7894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:26 np0005533658.novalocal python3[7896]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 24 13:50:26 np0005533658.novalocal sudo[7894]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:27 np0005533658.novalocal sudo[7920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlrimewbroijxydhkmfqufenaernjpcq ; /usr/bin/python3'
Nov 24 13:50:27 np0005533658.novalocal sudo[7920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:27 np0005533658.novalocal python3[7922]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:50:27 np0005533658.novalocal sudo[7920]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:27 np0005533658.novalocal sudo[7948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfqcisodmnojgdhqzzsoilzkcdjrtnbw ; /usr/bin/python3'
Nov 24 13:50:27 np0005533658.novalocal sudo[7948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:27 np0005533658.novalocal python3[7950]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:50:27 np0005533658.novalocal sudo[7948]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:27 np0005533658.novalocal sudo[7976]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqcpckqdstjkezbbuhktklomlucrbhjr ; /usr/bin/python3'
Nov 24 13:50:27 np0005533658.novalocal sudo[7976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:27 np0005533658.novalocal python3[7978]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:50:27 np0005533658.novalocal sudo[7976]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:27 np0005533658.novalocal sudo[8004]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qathbpxqmnkbjzqathsrgbpvvbseybpo ; /usr/bin/python3'
Nov 24 13:50:27 np0005533658.novalocal sudo[8004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:28 np0005533658.novalocal python3[8006]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:50:28 np0005533658.novalocal sudo[8004]: pam_unix(sudo:session): session closed for user root
Nov 24 13:50:28 np0005533658.novalocal python3[8033]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-2c54-7796-000000001ccb-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:50:29 np0005533658.novalocal python3[8063]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 24 13:50:31 np0005533658.novalocal sshd-session[7480]: Connection closed by 38.102.83.114 port 42036
Nov 24 13:50:31 np0005533658.novalocal sshd-session[7477]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:50:31 np0005533658.novalocal systemd-logind[807]: Session 4 logged out. Waiting for processes to exit.
Nov 24 13:50:31 np0005533658.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Nov 24 13:50:31 np0005533658.novalocal systemd[1]: session-4.scope: Consumed 3.969s CPU time.
Nov 24 13:50:31 np0005533658.novalocal systemd-logind[807]: Removed session 4.
Nov 24 13:50:32 np0005533658.novalocal sshd-session[8068]: Accepted publickey for zuul from 38.102.83.114 port 50382 ssh2: RSA SHA256:StdIAygMGQUV11+A5F2zWmMmSbeKNeh5/FhmIBCqVH0
Nov 24 13:50:32 np0005533658.novalocal systemd-logind[807]: New session 5 of user zuul.
Nov 24 13:50:32 np0005533658.novalocal systemd[1]: Started Session 5 of User zuul.
Nov 24 13:50:32 np0005533658.novalocal sshd-session[8068]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:50:32 np0005533658.novalocal sudo[8095]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-burttinzoqsfyxjgxkulrxjgxrwvjfnm ; /usr/bin/python3'
Nov 24 13:50:32 np0005533658.novalocal sudo[8095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:50:32 np0005533658.novalocal python3[8097]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 24 13:50:47 np0005533658.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 24 13:50:47 np0005533658.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 13:50:47 np0005533658.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 13:50:47 np0005533658.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 13:50:47 np0005533658.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 13:50:47 np0005533658.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 13:50:47 np0005533658.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 13:50:47 np0005533658.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 13:50:56 np0005533658.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 24 13:50:56 np0005533658.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 13:50:56 np0005533658.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 13:50:56 np0005533658.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 13:50:56 np0005533658.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 13:50:56 np0005533658.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 13:50:56 np0005533658.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 13:50:56 np0005533658.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 13:51:05 np0005533658.novalocal kernel: SELinux:  Converting 385 SID table entries...
Nov 24 13:51:05 np0005533658.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 13:51:05 np0005533658.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 13:51:05 np0005533658.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 13:51:05 np0005533658.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 13:51:05 np0005533658.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 13:51:05 np0005533658.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 13:51:05 np0005533658.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 13:51:06 np0005533658.novalocal setsebool[8166]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 24 13:51:06 np0005533658.novalocal setsebool[8166]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 24 13:51:17 np0005533658.novalocal kernel: SELinux:  Converting 388 SID table entries...
Nov 24 13:51:17 np0005533658.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 13:51:17 np0005533658.novalocal kernel: SELinux:  policy capability open_perms=1
Nov 24 13:51:17 np0005533658.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 13:51:17 np0005533658.novalocal kernel: SELinux:  policy capability always_check_network=0
Nov 24 13:51:17 np0005533658.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 13:51:17 np0005533658.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 13:51:17 np0005533658.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 13:51:35 np0005533658.novalocal dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 24 13:51:35 np0005533658.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 13:51:35 np0005533658.novalocal systemd[1]: Starting man-db-cache-update.service...
Nov 24 13:51:35 np0005533658.novalocal systemd[1]: Reloading.
Nov 24 13:51:36 np0005533658.novalocal systemd-rc-local-generator[8920]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 13:51:36 np0005533658.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 13:51:37 np0005533658.novalocal sudo[8095]: pam_unix(sudo:session): session closed for user root
Nov 24 13:51:45 np0005533658.novalocal python3[15245]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-8c4e-17c5-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:51:46 np0005533658.novalocal kernel: evm: overlay not supported
Nov 24 13:51:46 np0005533658.novalocal systemd[4300]: Starting D-Bus User Message Bus...
Nov 24 13:51:46 np0005533658.novalocal dbus-broker-launch[15787]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 24 13:51:46 np0005533658.novalocal dbus-broker-launch[15787]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 24 13:51:46 np0005533658.novalocal systemd[4300]: Started D-Bus User Message Bus.
Nov 24 13:51:46 np0005533658.novalocal dbus-broker-lau[15787]: Ready
Nov 24 13:51:46 np0005533658.novalocal systemd[4300]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 24 13:51:46 np0005533658.novalocal systemd[4300]: Created slice Slice /user.
Nov 24 13:51:46 np0005533658.novalocal systemd[4300]: podman-15708.scope: unit configures an IP firewall, but not running as root.
Nov 24 13:51:46 np0005533658.novalocal systemd[4300]: (This warning is only shown for the first unit using IP firewalling.)
Nov 24 13:51:46 np0005533658.novalocal systemd[4300]: Started podman-15708.scope.
Nov 24 13:51:46 np0005533658.novalocal systemd[4300]: Started podman-pause-78497fa9.scope.
Nov 24 13:51:47 np0005533658.novalocal sudo[16262]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unvzkuuzfeeumyzwhurcmbasxnhyejhp ; /usr/bin/python3'
Nov 24 13:51:47 np0005533658.novalocal sudo[16262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:51:47 np0005533658.novalocal python3[16274]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.163:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.163:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:51:47 np0005533658.novalocal python3[16274]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 24 13:51:47 np0005533658.novalocal sudo[16262]: pam_unix(sudo:session): session closed for user root
Nov 24 13:51:48 np0005533658.novalocal sshd-session[8071]: Connection closed by 38.102.83.114 port 50382
Nov 24 13:51:48 np0005533658.novalocal sshd-session[8068]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:51:48 np0005533658.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Nov 24 13:51:48 np0005533658.novalocal systemd[1]: session-5.scope: Consumed 59.741s CPU time.
Nov 24 13:51:48 np0005533658.novalocal systemd-logind[807]: Session 5 logged out. Waiting for processes to exit.
Nov 24 13:51:48 np0005533658.novalocal systemd-logind[807]: Removed session 5.
Nov 24 13:51:59 np0005533658.novalocal irqbalance[784]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 24 13:51:59 np0005533658.novalocal irqbalance[784]: IRQ 27 affinity is now unmanaged
Nov 24 13:52:07 np0005533658.novalocal sshd-session[24857]: Connection closed by 38.102.83.47 port 45112 [preauth]
Nov 24 13:52:07 np0005533658.novalocal sshd-session[24863]: Connection closed by 38.102.83.47 port 45128 [preauth]
Nov 24 13:52:07 np0005533658.novalocal sshd-session[24866]: Unable to negotiate with 38.102.83.47 port 45138: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 24 13:52:07 np0005533658.novalocal sshd-session[24858]: Unable to negotiate with 38.102.83.47 port 45140: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 24 13:52:07 np0005533658.novalocal sshd-session[24860]: Unable to negotiate with 38.102.83.47 port 45152: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 24 13:52:12 np0005533658.novalocal sshd-session[26840]: Accepted publickey for zuul from 38.102.83.114 port 32966 ssh2: RSA SHA256:StdIAygMGQUV11+A5F2zWmMmSbeKNeh5/FhmIBCqVH0
Nov 24 13:52:12 np0005533658.novalocal systemd-logind[807]: New session 6 of user zuul.
Nov 24 13:52:12 np0005533658.novalocal systemd[1]: Started Session 6 of User zuul.
Nov 24 13:52:12 np0005533658.novalocal sshd-session[26840]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:52:12 np0005533658.novalocal python3[26907]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLayQXNiiamh61b2GAxoH6SmvHvb9WUIMKaRmEKs95oq4P4Ig9YnBipVYTckk+iqTZdiNXczO5lJP5fb0t8Y6YU= zuul@np0005533657.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:52:12 np0005533658.novalocal sudo[27075]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shptqghcpntuawnbktrtrahtxlmvcrac ; /usr/bin/python3'
Nov 24 13:52:12 np0005533658.novalocal sudo[27075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:52:12 np0005533658.novalocal python3[27086]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLayQXNiiamh61b2GAxoH6SmvHvb9WUIMKaRmEKs95oq4P4Ig9YnBipVYTckk+iqTZdiNXczO5lJP5fb0t8Y6YU= zuul@np0005533657.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:52:12 np0005533658.novalocal sudo[27075]: pam_unix(sudo:session): session closed for user root
Nov 24 13:52:13 np0005533658.novalocal sudo[27499]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyqdotdevxkmfnazxoosrpugcyodlegf ; /usr/bin/python3'
Nov 24 13:52:13 np0005533658.novalocal sudo[27499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:52:13 np0005533658.novalocal python3[27509]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005533658.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 24 13:52:13 np0005533658.novalocal useradd[27596]: new group: name=cloud-admin, GID=1002
Nov 24 13:52:13 np0005533658.novalocal useradd[27596]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Nov 24 13:52:13 np0005533658.novalocal sudo[27499]: pam_unix(sudo:session): session closed for user root
Nov 24 13:52:13 np0005533658.novalocal sudo[27760]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbemvjmeyexqryzvoaefvtckyzdxyazw ; /usr/bin/python3'
Nov 24 13:52:13 np0005533658.novalocal sudo[27760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:52:14 np0005533658.novalocal python3[27772]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLayQXNiiamh61b2GAxoH6SmvHvb9WUIMKaRmEKs95oq4P4Ig9YnBipVYTckk+iqTZdiNXczO5lJP5fb0t8Y6YU= zuul@np0005533657.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 24 13:52:14 np0005533658.novalocal sudo[27760]: pam_unix(sudo:session): session closed for user root
Nov 24 13:52:14 np0005533658.novalocal sudo[28021]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iywnhnsftqpikonsymhedvgcgwmrfaxj ; /usr/bin/python3'
Nov 24 13:52:14 np0005533658.novalocal sudo[28021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:52:14 np0005533658.novalocal python3[28031]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:52:14 np0005533658.novalocal sudo[28021]: pam_unix(sudo:session): session closed for user root
Nov 24 13:52:14 np0005533658.novalocal sudo[28311]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iemqmllljkqlwdkrkjhqbxikyrbcmzyr ; /usr/bin/python3'
Nov 24 13:52:14 np0005533658.novalocal sudo[28311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:52:15 np0005533658.novalocal python3[28320]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763992334.2899315-135-168172983360433/source _original_basename=tmp4d9let4k follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:52:15 np0005533658.novalocal sudo[28311]: pam_unix(sudo:session): session closed for user root
Nov 24 13:52:15 np0005533658.novalocal sudo[28655]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvpyetohmymxdbcswtzrvmcqlcrtfqmu ; /usr/bin/python3'
Nov 24 13:52:15 np0005533658.novalocal sudo[28655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:52:15 np0005533658.novalocal python3[28666]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Nov 24 13:52:15 np0005533658.novalocal systemd[1]: Starting Hostname Service...
Nov 24 13:52:15 np0005533658.novalocal systemd[1]: Started Hostname Service.
Nov 24 13:52:15 np0005533658.novalocal systemd-hostnamed[28811]: Changed pretty hostname to 'compute-0'
Nov 24 13:52:15 compute-0 systemd-hostnamed[28811]: Hostname set to <compute-0> (static)
Nov 24 13:52:15 compute-0 NetworkManager[7187]: <info>  [1763992335.8474] hostname: static hostname changed from "np0005533658.novalocal" to "compute-0"
Nov 24 13:52:15 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 13:52:15 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 13:52:15 compute-0 sudo[28655]: pam_unix(sudo:session): session closed for user root
Nov 24 13:52:16 compute-0 sshd-session[26851]: Connection closed by 38.102.83.114 port 32966
Nov 24 13:52:16 compute-0 sshd-session[26840]: pam_unix(sshd:session): session closed for user zuul
Nov 24 13:52:16 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Nov 24 13:52:16 compute-0 systemd[1]: session-6.scope: Consumed 2.016s CPU time.
Nov 24 13:52:16 compute-0 systemd-logind[807]: Session 6 logged out. Waiting for processes to exit.
Nov 24 13:52:16 compute-0 systemd-logind[807]: Removed session 6.
Nov 24 13:52:18 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 13:52:18 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 13:52:18 compute-0 systemd[1]: man-db-cache-update.service: Consumed 50.012s CPU time.
Nov 24 13:52:18 compute-0 systemd[1]: run-r970c12b969314a2faa5c2988251f687e.service: Deactivated successfully.
Nov 24 13:52:25 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 13:52:45 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 13:55:36 compute-0 sshd-session[29928]: banner exchange: Connection from 20.65.193.105 port 45878: invalid format
Nov 24 13:55:46 compute-0 sshd-session[29926]: Connection closed by 20.65.193.105 port 45870 [preauth]
Nov 24 13:55:50 compute-0 sshd-session[29929]: Accepted publickey for zuul from 38.102.83.47 port 44440 ssh2: RSA SHA256:StdIAygMGQUV11+A5F2zWmMmSbeKNeh5/FhmIBCqVH0
Nov 24 13:55:50 compute-0 systemd-logind[807]: New session 7 of user zuul.
Nov 24 13:55:50 compute-0 systemd[1]: Started Session 7 of User zuul.
Nov 24 13:55:50 compute-0 sshd-session[29929]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 13:55:50 compute-0 python3[30005]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 13:55:51 compute-0 sudo[30119]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtqkbvblhrvuuaiyfvhrjmzpbyyvemaq ; /usr/bin/python3'
Nov 24 13:55:51 compute-0 sudo[30119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:52 compute-0 python3[30121]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:55:52 compute-0 sudo[30119]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:52 compute-0 sudo[30192]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-likpvvvlvceokelxcqrtybuhwcgwyvwv ; /usr/bin/python3'
Nov 24 13:55:52 compute-0 sudo[30192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:52 compute-0 python3[30194]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763992551.802155-33574-126407358225904/source mode=0755 _original_basename=delorean.repo follow=False checksum=8aa5769cfeddc0c9144a7a1fe8cb4242536b164f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:55:52 compute-0 sudo[30192]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:52 compute-0 sudo[30218]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntfalttyyuupirlsovveuyktquxewsgq ; /usr/bin/python3'
Nov 24 13:55:52 compute-0 sudo[30218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:52 compute-0 python3[30220]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:55:52 compute-0 sudo[30218]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:53 compute-0 sudo[30291]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjdanyfedjxqsbswrlemzcyqglyhjpgu ; /usr/bin/python3'
Nov 24 13:55:53 compute-0 sudo[30291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:53 compute-0 python3[30293]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763992551.802155-33574-126407358225904/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:55:53 compute-0 sudo[30291]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:53 compute-0 sudo[30317]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktdibdjomuyisojrycljeheppdoportv ; /usr/bin/python3'
Nov 24 13:55:53 compute-0 sudo[30317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:53 compute-0 python3[30319]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:55:53 compute-0 sudo[30317]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:53 compute-0 sudo[30390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siqywdzdfinazmqaerdsrzdrpehaltmx ; /usr/bin/python3'
Nov 24 13:55:53 compute-0 sudo[30390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:53 compute-0 python3[30392]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763992551.802155-33574-126407358225904/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:55:53 compute-0 sudo[30390]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:53 compute-0 sudo[30416]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqcojzdwgbybysnyfyzgbujckttdjuxn ; /usr/bin/python3'
Nov 24 13:55:53 compute-0 sudo[30416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:53 compute-0 python3[30418]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:55:53 compute-0 sudo[30416]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:54 compute-0 sudo[30489]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnigctntlyvonkqspqgnusxkimhbbroy ; /usr/bin/python3'
Nov 24 13:55:54 compute-0 sudo[30489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:54 compute-0 python3[30491]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763992551.802155-33574-126407358225904/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:55:54 compute-0 sudo[30489]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:54 compute-0 sudo[30515]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhjarjhdxgyvzukuuoyekhzltgzzywvj ; /usr/bin/python3'
Nov 24 13:55:54 compute-0 sudo[30515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:54 compute-0 python3[30517]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:55:54 compute-0 sudo[30515]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:54 compute-0 sudo[30588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qapktiefwxeimawvgtirbltzliijlyjz ; /usr/bin/python3'
Nov 24 13:55:54 compute-0 sudo[30588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:54 compute-0 python3[30590]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763992551.802155-33574-126407358225904/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:55:54 compute-0 sudo[30588]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:54 compute-0 sudo[30614]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkblrmumdzmurtljvnvfxncrzhrxwnyr ; /usr/bin/python3'
Nov 24 13:55:54 compute-0 sudo[30614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:55 compute-0 python3[30616]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:55:55 compute-0 sudo[30614]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:55 compute-0 sudo[30687]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcuhoxsvxcbctjnawkzhrtbqphsskowb ; /usr/bin/python3'
Nov 24 13:55:55 compute-0 sudo[30687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:55 compute-0 python3[30689]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763992551.802155-33574-126407358225904/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:55:55 compute-0 sudo[30687]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:55 compute-0 sudo[30713]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osuwqjqlglmokyoseywttbnnxgiknijp ; /usr/bin/python3'
Nov 24 13:55:55 compute-0 sudo[30713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:55 compute-0 python3[30715]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:55:55 compute-0 sudo[30713]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:55 compute-0 sudo[30786]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynlogafzhnsmneenzgsscxxuawrfgize ; /usr/bin/python3'
Nov 24 13:55:55 compute-0 sudo[30786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:55 compute-0 python3[30788]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763992551.802155-33574-126407358225904/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6646317362318a9831d66a1804f6bb7dd1b97cd5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:55:55 compute-0 sudo[30786]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:56 compute-0 sudo[30812]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jthzegqncfkcuxylfmyqkxieksplbbcv ; /usr/bin/python3'
Nov 24 13:55:56 compute-0 sudo[30812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:56 compute-0 python3[30814]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 24 13:55:56 compute-0 sudo[30812]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:56 compute-0 sudo[30885]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qswfyeoqjmzfhpnxqatrwhqlxndeuiyl ; /usr/bin/python3'
Nov 24 13:55:56 compute-0 sudo[30885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 13:55:56 compute-0 python3[30887]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1763992551.802155-33574-126407358225904/source mode=0755 _original_basename=gating.repo follow=False checksum=6441df02a8d4ef2101898dc52c4564aae2f8faa7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 13:55:56 compute-0 sudo[30885]: pam_unix(sudo:session): session closed for user root
Nov 24 13:55:58 compute-0 sshd-session[30913]: Connection closed by 192.168.122.11 port 58110 [preauth]
Nov 24 13:55:58 compute-0 sshd-session[30912]: Connection closed by 192.168.122.11 port 58102 [preauth]
Nov 24 13:55:58 compute-0 sshd-session[30914]: Unable to negotiate with 192.168.122.11 port 58122: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Nov 24 13:55:58 compute-0 sshd-session[30915]: Unable to negotiate with 192.168.122.11 port 58128: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Nov 24 13:55:58 compute-0 sshd-session[30916]: Unable to negotiate with 192.168.122.11 port 58138: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Nov 24 13:56:07 compute-0 python3[30945]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 13:56:38 compute-0 sshd-session[30947]: Invalid user git from 80.94.95.115 port 25000
Nov 24 13:56:38 compute-0 sshd-session[30947]: Connection closed by invalid user git 80.94.95.115 port 25000 [preauth]
Nov 24 13:57:53 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 24 13:57:53 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 24 13:57:53 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 24 13:57:53 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 24 14:01:01 compute-0 CROND[30955]: (root) CMD (run-parts /etc/cron.hourly)
Nov 24 14:01:01 compute-0 run-parts[30958]: (/etc/cron.hourly) starting 0anacron
Nov 24 14:01:01 compute-0 anacron[30966]: Anacron started on 2025-11-24
Nov 24 14:01:01 compute-0 anacron[30966]: Will run job `cron.daily' in 44 min.
Nov 24 14:01:01 compute-0 anacron[30966]: Will run job `cron.weekly' in 64 min.
Nov 24 14:01:01 compute-0 anacron[30966]: Will run job `cron.monthly' in 84 min.
Nov 24 14:01:01 compute-0 anacron[30966]: Jobs will be executed sequentially
Nov 24 14:01:01 compute-0 run-parts[30968]: (/etc/cron.hourly) finished 0anacron
Nov 24 14:01:01 compute-0 CROND[30954]: (root) CMDEND (run-parts /etc/cron.hourly)
Nov 24 14:01:07 compute-0 sshd-session[29932]: Received disconnect from 38.102.83.47 port 44440:11: disconnected by user
Nov 24 14:01:07 compute-0 sshd-session[29932]: Disconnected from user zuul 38.102.83.47 port 44440
Nov 24 14:01:07 compute-0 sshd-session[29929]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:01:07 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Nov 24 14:01:07 compute-0 systemd[1]: session-7.scope: Consumed 5.129s CPU time.
Nov 24 14:01:07 compute-0 systemd-logind[807]: Session 7 logged out. Waiting for processes to exit.
Nov 24 14:01:07 compute-0 systemd-logind[807]: Removed session 7.
Nov 24 14:03:43 compute-0 sshd-session[30970]: Connection closed by authenticating user root 80.94.95.115 port 45182 [preauth]
Nov 24 14:06:44 compute-0 sshd-session[30973]: Accepted publickey for zuul from 192.168.122.30 port 37110 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:06:44 compute-0 systemd-logind[807]: New session 8 of user zuul.
Nov 24 14:06:44 compute-0 systemd[1]: Started Session 8 of User zuul.
Nov 24 14:06:44 compute-0 sshd-session[30973]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:06:45 compute-0 python3.9[31126]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:06:46 compute-0 sudo[31305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysjitdvbqxprrpaqupcrcpkghiogsnvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993206.5577292-32-65852164075448/AnsiballZ_command.py'
Nov 24 14:06:46 compute-0 sudo[31305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:06:47 compute-0 python3.9[31307]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:06:53 compute-0 sudo[31305]: pam_unix(sudo:session): session closed for user root
Nov 24 14:06:54 compute-0 sshd-session[30976]: Connection closed by 192.168.122.30 port 37110
Nov 24 14:06:54 compute-0 sshd-session[30973]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:06:54 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Nov 24 14:06:54 compute-0 systemd[1]: session-8.scope: Consumed 7.506s CPU time.
Nov 24 14:06:54 compute-0 systemd-logind[807]: Session 8 logged out. Waiting for processes to exit.
Nov 24 14:06:54 compute-0 systemd-logind[807]: Removed session 8.
Nov 24 14:06:59 compute-0 sshd-session[31364]: Accepted publickey for zuul from 192.168.122.30 port 41852 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:06:59 compute-0 systemd-logind[807]: New session 9 of user zuul.
Nov 24 14:06:59 compute-0 systemd[1]: Started Session 9 of User zuul.
Nov 24 14:06:59 compute-0 sshd-session[31364]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:07:00 compute-0 python3.9[31517]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:07:00 compute-0 sshd-session[31367]: Connection closed by 192.168.122.30 port 41852
Nov 24 14:07:00 compute-0 sshd-session[31364]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:07:00 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Nov 24 14:07:00 compute-0 systemd-logind[807]: Session 9 logged out. Waiting for processes to exit.
Nov 24 14:07:00 compute-0 systemd-logind[807]: Removed session 9.
Nov 24 14:07:16 compute-0 sshd-session[31545]: Accepted publickey for zuul from 192.168.122.30 port 36604 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:07:16 compute-0 systemd-logind[807]: New session 10 of user zuul.
Nov 24 14:07:16 compute-0 systemd[1]: Started Session 10 of User zuul.
Nov 24 14:07:16 compute-0 sshd-session[31545]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:07:17 compute-0 python3.9[31698]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 24 14:07:18 compute-0 python3.9[31872]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:07:19 compute-0 sudo[32022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpfnrvpcqsesweivocvoqiudndgkttcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993238.7774768-45-14869638528425/AnsiballZ_command.py'
Nov 24 14:07:19 compute-0 sudo[32022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:19 compute-0 python3.9[32024]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:07:19 compute-0 sudo[32022]: pam_unix(sudo:session): session closed for user root
Nov 24 14:07:20 compute-0 sudo[32175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpmnowxhoaipcgbnudzxpcorrodsgyru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993239.663232-57-10284118494552/AnsiballZ_stat.py'
Nov 24 14:07:20 compute-0 sudo[32175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:20 compute-0 python3.9[32177]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:07:20 compute-0 sudo[32175]: pam_unix(sudo:session): session closed for user root
Nov 24 14:07:20 compute-0 sudo[32327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmyqackwqpjskcqypvbmnxvzfjwdzsfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993240.4539592-65-92133437351188/AnsiballZ_file.py'
Nov 24 14:07:20 compute-0 sudo[32327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:21 compute-0 python3.9[32329]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:07:21 compute-0 sudo[32327]: pam_unix(sudo:session): session closed for user root
Nov 24 14:07:21 compute-0 sudo[32479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwzusgqjoiiigyrbxtjowkiolbnbydhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993241.186065-73-60158815039286/AnsiballZ_stat.py'
Nov 24 14:07:21 compute-0 sudo[32479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:21 compute-0 python3.9[32481]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:07:21 compute-0 sudo[32479]: pam_unix(sudo:session): session closed for user root
Nov 24 14:07:22 compute-0 sudo[32602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibkkxjezlualyhhrevvcxbitvdvbfxmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993241.186065-73-60158815039286/AnsiballZ_copy.py'
Nov 24 14:07:22 compute-0 sudo[32602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:22 compute-0 python3.9[32604]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993241.186065-73-60158815039286/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:07:22 compute-0 sudo[32602]: pam_unix(sudo:session): session closed for user root
Nov 24 14:07:22 compute-0 sudo[32754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzztelgxivfqlsyssfagwzpzrbewpzib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993242.6907234-88-219292101873151/AnsiballZ_setup.py'
Nov 24 14:07:22 compute-0 sudo[32754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:23 compute-0 python3.9[32756]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:07:23 compute-0 sudo[32754]: pam_unix(sudo:session): session closed for user root
Nov 24 14:07:23 compute-0 sudo[32910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evpigqbxswrbcqyaktlxfqxvrqkjelkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993243.585352-96-206369639245807/AnsiballZ_file.py'
Nov 24 14:07:23 compute-0 sudo[32910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:24 compute-0 python3.9[32912]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:07:24 compute-0 sudo[32910]: pam_unix(sudo:session): session closed for user root
Nov 24 14:07:24 compute-0 sudo[33062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxgpaivpyykpvothdkxubdhxnnadaixp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993244.2219286-105-45763799360833/AnsiballZ_file.py'
Nov 24 14:07:24 compute-0 sudo[33062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:24 compute-0 python3.9[33064]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:07:24 compute-0 sudo[33062]: pam_unix(sudo:session): session closed for user root
Nov 24 14:07:25 compute-0 python3.9[33214]: ansible-ansible.builtin.service_facts Invoked
Nov 24 14:07:28 compute-0 python3.9[33467]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:07:29 compute-0 python3.9[33617]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:07:30 compute-0 python3.9[33771]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:07:30 compute-0 sudo[33927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vytsoepsqfijxnjunhoofvokautgpfdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993250.5205228-153-170096667940829/AnsiballZ_setup.py'
Nov 24 14:07:30 compute-0 sudo[33927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:31 compute-0 python3.9[33929]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:07:31 compute-0 sudo[33927]: pam_unix(sudo:session): session closed for user root
Nov 24 14:07:31 compute-0 sudo[34011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnwfaqewspeypkafsjzrejpwzkilfwsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993250.5205228-153-170096667940829/AnsiballZ_dnf.py'
Nov 24 14:07:31 compute-0 sudo[34011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:07:31 compute-0 python3.9[34013]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:08:13 compute-0 systemd[1]: Reloading.
Nov 24 14:08:13 compute-0 systemd-rc-local-generator[34212]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:08:14 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 24 14:08:14 compute-0 systemd[1]: Reloading.
Nov 24 14:08:14 compute-0 systemd-rc-local-generator[34250]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:08:14 compute-0 systemd[1]: Starting dnf makecache...
Nov 24 14:08:14 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 24 14:08:14 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 24 14:08:14 compute-0 systemd[1]: Reloading.
Nov 24 14:08:14 compute-0 systemd-rc-local-generator[34293]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:08:14 compute-0 dnf[34260]: Repository 'gating-repo' is missing name in configuration, using id.
Nov 24 14:08:14 compute-0 dnf[34260]: Failed determining last makecache time.
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-openstack-barbican-42b4c41831408a8e323 145 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 162 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-openstack-cinder-1c00d6490d88e436f26ef 165 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-python-stevedore-c4acc5639fd2329372142 167 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-python-observabilityclient-2f31846d73c 163 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-os-net-config-bbae2ed8a159b0435a473f38 125 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 162 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-python-designate-tests-tempest-347fdbc 152 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-openstack-glance-1fd12c29b339f30fe823e 154 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 157 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-openstack-manila-3c01b7181572c95dac462 147 kB/s | 3.0 kB     00:00
Nov 24 14:08:14 compute-0 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Nov 24 14:08:14 compute-0 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Nov 24 14:08:14 compute-0 dnf[34260]: delorean-python-whitebox-neutron-tests-tempest- 161 kB/s | 3.0 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: delorean-openstack-octavia-ba397f07a7331190208c 162 kB/s | 3.0 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: delorean-openstack-watcher-c014f81a8647287f6dcc 162 kB/s | 3.0 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: delorean-python-tcib-1124124ec06aadbac34f0d340b 164 kB/s | 3.0 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 154 kB/s | 3.0 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: delorean-openstack-swift-dc98a8463506ac520c469a 165 kB/s | 3.0 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: delorean-python-tempestconf-8515371b7cceebd4282 158 kB/s | 3.0 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: delorean-openstack-heat-ui-013accbfd179753bc3f0 166 kB/s | 3.0 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: gating-repo                                     497 kB/s | 3.0 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: CentOS Stream 9 - BaseOS                         29 kB/s | 7.3 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: CentOS Stream 9 - AppStream                      30 kB/s | 7.4 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: CentOS Stream 9 - CRB                            80 kB/s | 7.2 kB     00:00
Nov 24 14:08:15 compute-0 dnf[34260]: CentOS Stream 9 - Extras packages                76 kB/s | 8.3 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: dlrn-antelope-testing                           131 kB/s | 3.0 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: dlrn-antelope-build-deps                        140 kB/s | 3.0 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: centos9-rabbitmq                                118 kB/s | 3.0 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: centos9-storage                                 115 kB/s | 3.0 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: centos9-opstools                                 91 kB/s | 3.0 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: NFV SIG OpenvSwitch                              99 kB/s | 3.0 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: repo-setup-centos-appstream                     151 kB/s | 4.4 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: repo-setup-centos-baseos                        161 kB/s | 3.9 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: repo-setup-centos-highavailability              175 kB/s | 3.9 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: repo-setup-centos-powertools                    180 kB/s | 4.3 kB     00:00
Nov 24 14:08:16 compute-0 dnf[34260]: Extra Packages for Enterprise Linux 9 - x86_64  263 kB/s |  32 kB     00:00
Nov 24 14:08:17 compute-0 dnf[34260]: Metadata cache created.
Nov 24 14:08:17 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 24 14:08:17 compute-0 systemd[1]: Finished dnf makecache.
Nov 24 14:08:17 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.720s CPU time.
Nov 24 14:09:08 compute-0 sshd-session[34527]: Connection closed by 71.6.232.29 port 54288 [preauth]
Nov 24 14:09:14 compute-0 kernel: SELinux:  Converting 2718 SID table entries...
Nov 24 14:09:14 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 14:09:14 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 14:09:14 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 14:09:14 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 14:09:14 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 14:09:14 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 14:09:14 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 14:09:14 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 24 14:09:14 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 14:09:14 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 14:09:14 compute-0 systemd[1]: Reloading.
Nov 24 14:09:14 compute-0 systemd-rc-local-generator[34678]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:09:14 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 14:09:15 compute-0 sudo[34011]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:15 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 14:09:15 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 14:09:15 compute-0 systemd[1]: run-rdeccaeaf3618444fb5ea23d045415d00.service: Deactivated successfully.
Nov 24 14:09:15 compute-0 sudo[35588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jphxbzpiolahtrgdkmsmggdnatmzdejx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993355.5118043-165-119430628674626/AnsiballZ_command.py'
Nov 24 14:09:15 compute-0 sudo[35588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:15 compute-0 python3.9[35591]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:09:16 compute-0 sudo[35588]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:17 compute-0 sudo[35870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcqgdqruodlovruxecstehbhmuxorhhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993357.0315118-173-240156163661544/AnsiballZ_selinux.py'
Nov 24 14:09:17 compute-0 sudo[35870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:17 compute-0 python3.9[35872]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 24 14:09:17 compute-0 sudo[35870]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:18 compute-0 sudo[36022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrszhkdqndhmgnbtbyumzlxyefpyxxao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993358.2354248-184-151519369301102/AnsiballZ_command.py'
Nov 24 14:09:18 compute-0 sudo[36022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:18 compute-0 python3.9[36024]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 24 14:09:19 compute-0 sudo[36022]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:19 compute-0 sudo[36175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgkkqmjiuqiafhkwheejiilnuubxwacc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993359.684637-192-147766125676634/AnsiballZ_file.py'
Nov 24 14:09:19 compute-0 sudo[36175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:20 compute-0 python3.9[36177]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:09:20 compute-0 sudo[36175]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:21 compute-0 sudo[36327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxsqaxvmiuimexzeeohccbbalrejssuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993361.0303354-200-9065320718271/AnsiballZ_mount.py'
Nov 24 14:09:21 compute-0 sudo[36327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:21 compute-0 python3.9[36329]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 24 14:09:21 compute-0 sudo[36327]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:22 compute-0 sudo[36479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ravieijvgqepxflbvlkfsxhkplbcdwvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993362.5459366-228-212280922860402/AnsiballZ_file.py'
Nov 24 14:09:22 compute-0 sudo[36479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:23 compute-0 python3.9[36481]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:09:23 compute-0 sudo[36479]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:23 compute-0 sudo[36631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiucmouabwojxtxwrziglwowoylzkhac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993363.2121787-236-62781755583613/AnsiballZ_stat.py'
Nov 24 14:09:23 compute-0 sudo[36631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:23 compute-0 python3.9[36633]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:09:23 compute-0 sudo[36631]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:24 compute-0 sudo[36754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awtwelghmtuvubonudahuxhtqgwclkky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993363.2121787-236-62781755583613/AnsiballZ_copy.py'
Nov 24 14:09:24 compute-0 sudo[36754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:24 compute-0 python3.9[36756]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993363.2121787-236-62781755583613/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=451f751a491b9363156c1c8b1997faec65d8ee76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:09:24 compute-0 sudo[36754]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:24 compute-0 sudo[36906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsilxmmnyfndijfcvuefrsefvxanyuav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993364.6580212-260-65184885424665/AnsiballZ_stat.py'
Nov 24 14:09:24 compute-0 sudo[36906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:28 compute-0 python3.9[36908]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:09:28 compute-0 sudo[36906]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:29 compute-0 sudo[37058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-erncewcghcdesjmyzjsworairqzvfjnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993368.980447-268-267628422783454/AnsiballZ_command.py'
Nov 24 14:09:29 compute-0 sudo[37058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:29 compute-0 python3.9[37060]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:09:29 compute-0 sudo[37058]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:30 compute-0 sudo[37211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynhmwrqbvclxeajzfnbfylinmeavxbtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993369.7505255-276-90944247357307/AnsiballZ_file.py'
Nov 24 14:09:30 compute-0 sudo[37211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:30 compute-0 python3.9[37213]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:09:30 compute-0 sudo[37211]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:31 compute-0 sudo[37363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbbjitdowrgwpldnxxlexrqnytsdoqvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993370.5758777-287-172045751258661/AnsiballZ_getent.py'
Nov 24 14:09:31 compute-0 sudo[37363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:31 compute-0 python3.9[37365]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 24 14:09:31 compute-0 sudo[37363]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:31 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 14:09:31 compute-0 sudo[37517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lypcgeugylwfyhnwrzlwbpfclgrsuywv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993371.394478-295-62495987013798/AnsiballZ_group.py'
Nov 24 14:09:31 compute-0 sudo[37517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:31 compute-0 python3.9[37519]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 14:09:32 compute-0 groupadd[37520]: group added to /etc/group: name=qemu, GID=107
Nov 24 14:09:32 compute-0 groupadd[37520]: group added to /etc/gshadow: name=qemu
Nov 24 14:09:32 compute-0 groupadd[37520]: new group: name=qemu, GID=107
Nov 24 14:09:32 compute-0 sudo[37517]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:32 compute-0 sudo[37675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvtmdsobixyagitgicrofaymdpeieagx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993372.2165291-303-124330392589723/AnsiballZ_user.py'
Nov 24 14:09:32 compute-0 sudo[37675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:32 compute-0 python3.9[37677]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 14:09:32 compute-0 useradd[37679]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 14:09:32 compute-0 sudo[37675]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:33 compute-0 sudo[37835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evboqpedmzidjeioupfxxuvpahaxapzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993373.130416-311-41568156929584/AnsiballZ_getent.py'
Nov 24 14:09:33 compute-0 sudo[37835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:33 compute-0 python3.9[37837]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 24 14:09:33 compute-0 sudo[37835]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:34 compute-0 sudo[37988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtizgcmabivjyiebtpqausdcoddoybtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993373.8603425-319-44360529051425/AnsiballZ_group.py'
Nov 24 14:09:34 compute-0 sudo[37988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:34 compute-0 python3.9[37990]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 14:09:34 compute-0 groupadd[37991]: group added to /etc/group: name=hugetlbfs, GID=42477
Nov 24 14:09:34 compute-0 groupadd[37991]: group added to /etc/gshadow: name=hugetlbfs
Nov 24 14:09:34 compute-0 groupadd[37991]: new group: name=hugetlbfs, GID=42477
Nov 24 14:09:34 compute-0 sudo[37988]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:34 compute-0 sudo[38146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmrqmjahpfbzosytyerbqfwtoqmwwpnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993374.6669166-328-89807226815005/AnsiballZ_file.py'
Nov 24 14:09:34 compute-0 sudo[38146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:35 compute-0 python3.9[38148]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 24 14:09:35 compute-0 sudo[38146]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:35 compute-0 sudo[38298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faeqgvswsaoyburdxaacpukaxrttakwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993375.501346-339-133071059345543/AnsiballZ_dnf.py'
Nov 24 14:09:35 compute-0 sudo[38298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:35 compute-0 python3.9[38300]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:09:37 compute-0 sudo[38298]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:37 compute-0 sudo[38451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bythsnapgwotyftnblbyjocnedzfhonm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993377.7081625-347-100509256564947/AnsiballZ_file.py'
Nov 24 14:09:37 compute-0 sudo[38451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:38 compute-0 python3.9[38453]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:09:38 compute-0 sudo[38451]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:38 compute-0 sudo[38603]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxomiuhrlnvgotdinbbjsaohxfqlrdxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993378.3163664-355-11101354318764/AnsiballZ_stat.py'
Nov 24 14:09:38 compute-0 sudo[38603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:38 compute-0 python3.9[38605]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:09:38 compute-0 sudo[38603]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:39 compute-0 sudo[38726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eelfzcqomdikrjyhreejcxzwvijubrav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993378.3163664-355-11101354318764/AnsiballZ_copy.py'
Nov 24 14:09:39 compute-0 sudo[38726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:39 compute-0 python3.9[38728]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993378.3163664-355-11101354318764/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:09:39 compute-0 sudo[38726]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:39 compute-0 sudo[38878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jimyehwttzplsmbszcdrrwuzdugogllo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993379.4284456-370-106024099350293/AnsiballZ_systemd.py'
Nov 24 14:09:39 compute-0 sudo[38878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:40 compute-0 python3.9[38880]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:09:40 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 24 14:09:40 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 24 14:09:40 compute-0 kernel: Bridge firewalling registered
Nov 24 14:09:40 compute-0 systemd-modules-load[38884]: Inserted module 'br_netfilter'
Nov 24 14:09:40 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 24 14:09:40 compute-0 sudo[38878]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:40 compute-0 sudo[39037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqpblfakvgswkdovcoikglajpmyjkdlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993380.6025321-378-26816065420925/AnsiballZ_stat.py'
Nov 24 14:09:40 compute-0 sudo[39037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:41 compute-0 python3.9[39039]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:09:41 compute-0 sudo[39037]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:41 compute-0 sudo[39160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iroiyrzxmpodtkuprdkvwosywesvazfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993380.6025321-378-26816065420925/AnsiballZ_copy.py'
Nov 24 14:09:41 compute-0 sudo[39160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:41 compute-0 python3.9[39162]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993380.6025321-378-26816065420925/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:09:41 compute-0 sudo[39160]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:42 compute-0 sudo[39312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urbaitxbtjvogwgnoqzxcxgnkumdypoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993381.9511874-396-217311361208615/AnsiballZ_dnf.py'
Nov 24 14:09:42 compute-0 sudo[39312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:42 compute-0 python3.9[39314]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:09:45 compute-0 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Nov 24 14:09:45 compute-0 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Nov 24 14:09:45 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 14:09:45 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 14:09:45 compute-0 systemd[1]: Reloading.
Nov 24 14:09:45 compute-0 systemd-rc-local-generator[39378]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:09:45 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 14:09:46 compute-0 sudo[39312]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:47 compute-0 python3.9[40475]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:09:47 compute-0 python3.9[41535]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 24 14:09:48 compute-0 python3.9[42409]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:09:48 compute-0 sudo[43314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imccyfndhofohblumnbwywrrhnueuzfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993388.6677456-435-184973676044706/AnsiballZ_command.py'
Nov 24 14:09:48 compute-0 sudo[43314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:49 compute-0 python3.9[43332]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:09:49 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 24 14:09:49 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 14:09:49 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 14:09:49 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.513s CPU time.
Nov 24 14:09:49 compute-0 systemd[1]: run-rd467f1f1820a4534a6d143b46f394872.service: Deactivated successfully.
Nov 24 14:09:49 compute-0 systemd[1]: Starting Authorization Manager...
Nov 24 14:09:49 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 24 14:09:49 compute-0 polkitd[43722]: Started polkitd version 0.117
Nov 24 14:09:49 compute-0 polkitd[43722]: Loading rules from directory /etc/polkit-1/rules.d
Nov 24 14:09:49 compute-0 polkitd[43722]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 24 14:09:49 compute-0 polkitd[43722]: Finished loading, compiling and executing 2 rules
Nov 24 14:09:49 compute-0 systemd[1]: Started Authorization Manager.
Nov 24 14:09:49 compute-0 polkitd[43722]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Nov 24 14:09:49 compute-0 sudo[43314]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:50 compute-0 sudo[43890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksrdztxtmwxsbqnpoeisfofiyntuzxlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993389.9707332-444-222379521261375/AnsiballZ_systemd.py'
Nov 24 14:09:50 compute-0 sudo[43890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:50 compute-0 python3.9[43892]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:09:50 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 24 14:09:50 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Nov 24 14:09:50 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 24 14:09:50 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 24 14:09:50 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 24 14:09:50 compute-0 sudo[43890]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:51 compute-0 python3.9[44054]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 24 14:09:53 compute-0 sudo[44204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwcyyrumvssccjjsadzbbidxpkckvhug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993392.9772737-501-232384179039951/AnsiballZ_systemd.py'
Nov 24 14:09:53 compute-0 sudo[44204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:53 compute-0 python3.9[44206]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:09:53 compute-0 systemd[1]: Reloading.
Nov 24 14:09:53 compute-0 systemd-rc-local-generator[44232]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:09:53 compute-0 sudo[44204]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:54 compute-0 sudo[44393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msxoundwndgjomvxguvcjptsnadmsdlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993393.9548125-501-123093435342308/AnsiballZ_systemd.py'
Nov 24 14:09:54 compute-0 sudo[44393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:54 compute-0 python3.9[44395]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:09:54 compute-0 systemd[1]: Reloading.
Nov 24 14:09:54 compute-0 systemd-rc-local-generator[44426]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:09:54 compute-0 sudo[44393]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:55 compute-0 sudo[44583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvxjaezjstjunnsagkcaitegiuboiqci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993395.0534697-517-236978054773790/AnsiballZ_command.py'
Nov 24 14:09:55 compute-0 sudo[44583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:55 compute-0 python3.9[44585]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:09:55 compute-0 sudo[44583]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:55 compute-0 sudo[44736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibjgozanckgdmkxfdvsnebwcfwfgximz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993395.7258673-525-164307331322054/AnsiballZ_command.py'
Nov 24 14:09:55 compute-0 sudo[44736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:56 compute-0 python3.9[44738]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:09:56 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 24 14:09:56 compute-0 sudo[44736]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:56 compute-0 sudo[44889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dopyarvczjwyagqqjiecefnpdggclmkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993396.3683724-533-155895929702965/AnsiballZ_command.py'
Nov 24 14:09:56 compute-0 sudo[44889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:56 compute-0 python3.9[44891]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:09:58 compute-0 sudo[44889]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:58 compute-0 sudo[45051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmtnhmpqxdcltfysaettskxoeguoodui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993398.3229682-541-244092471785259/AnsiballZ_command.py'
Nov 24 14:09:58 compute-0 sudo[45051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:58 compute-0 python3.9[45053]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:09:58 compute-0 sudo[45051]: pam_unix(sudo:session): session closed for user root
Nov 24 14:09:59 compute-0 sudo[45204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iidujlbkvhqrikjeusbqaccjmwpqarns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993398.939459-549-246448855217390/AnsiballZ_systemd.py'
Nov 24 14:09:59 compute-0 sudo[45204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:09:59 compute-0 python3.9[45206]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:09:59 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 24 14:09:59 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Nov 24 14:09:59 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Nov 24 14:09:59 compute-0 systemd[1]: Starting Apply Kernel Variables...
Nov 24 14:09:59 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 24 14:09:59 compute-0 systemd[1]: Finished Apply Kernel Variables.
Nov 24 14:09:59 compute-0 sudo[45204]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:00 compute-0 sshd-session[31548]: Connection closed by 192.168.122.30 port 36604
Nov 24 14:10:00 compute-0 sshd-session[31545]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:10:00 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Nov 24 14:10:00 compute-0 systemd[1]: session-10.scope: Consumed 2min 7.495s CPU time.
Nov 24 14:10:00 compute-0 systemd-logind[807]: Session 10 logged out. Waiting for processes to exit.
Nov 24 14:10:00 compute-0 systemd-logind[807]: Removed session 10.
Nov 24 14:10:05 compute-0 sshd-session[45236]: Accepted publickey for zuul from 192.168.122.30 port 36476 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:10:05 compute-0 systemd-logind[807]: New session 11 of user zuul.
Nov 24 14:10:05 compute-0 systemd[1]: Started Session 11 of User zuul.
Nov 24 14:10:05 compute-0 sshd-session[45236]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:10:06 compute-0 python3.9[45389]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:10:07 compute-0 python3.9[45543]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:10:08 compute-0 sudo[45697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwydrnyjiovqautmgpdirvudcuoadznn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993408.3859196-50-164272726536497/AnsiballZ_command.py'
Nov 24 14:10:08 compute-0 sudo[45697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:08 compute-0 python3.9[45699]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:10:08 compute-0 sudo[45697]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:09 compute-0 python3.9[45850]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:10:10 compute-0 sudo[46004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbqiafwhczuvcudscxmeyjqxsyjfqbnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993410.2246354-70-215204064460922/AnsiballZ_setup.py'
Nov 24 14:10:10 compute-0 sudo[46004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:10 compute-0 python3.9[46006]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:10:11 compute-0 sudo[46004]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:11 compute-0 sudo[46088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obdqbvxrafwmjfzdpernuzhptiwhhofq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993410.2246354-70-215204064460922/AnsiballZ_dnf.py'
Nov 24 14:10:11 compute-0 sudo[46088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:11 compute-0 python3.9[46090]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:10:13 compute-0 sudo[46088]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:13 compute-0 sudo[46241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llukcatnjdimczoytxerjbnsejamkgpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993413.2838912-82-212357929895436/AnsiballZ_setup.py'
Nov 24 14:10:13 compute-0 sudo[46241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:13 compute-0 python3.9[46243]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:10:14 compute-0 sudo[46241]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:14 compute-0 sudo[46412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvugyoatqftxgzhgequstwrvonqxucua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993414.3163633-93-218619958495271/AnsiballZ_file.py'
Nov 24 14:10:14 compute-0 sudo[46412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:14 compute-0 python3.9[46414]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:10:14 compute-0 sudo[46412]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:15 compute-0 sudo[46564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkrrctlfonvpjvdfllvacsrwipjxfert ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993415.1450765-101-105229490830296/AnsiballZ_command.py'
Nov 24 14:10:15 compute-0 sudo[46564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:15 compute-0 python3.9[46566]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:10:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2536616095-merged.mount: Deactivated successfully.
Nov 24 14:10:15 compute-0 podman[46567]: 2025-11-24 14:10:15.678516516 +0000 UTC m=+0.069812858 system refresh
Nov 24 14:10:15 compute-0 sudo[46564]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:16 compute-0 sudo[46728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqlrskrkbugntxlzcibdnalyuioslxkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993415.8716655-109-153386072825499/AnsiballZ_stat.py'
Nov 24 14:10:16 compute-0 sudo[46728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:16 compute-0 python3.9[46730]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:10:16 compute-0 sudo[46728]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:10:16 compute-0 sudo[46851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxgarhbmmnifezcqvfknctixxqoxdmhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993415.8716655-109-153386072825499/AnsiballZ_copy.py'
Nov 24 14:10:16 compute-0 sudo[46851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:17 compute-0 python3.9[46853]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993415.8716655-109-153386072825499/.source.json follow=False _original_basename=podman_network_config.j2 checksum=947a1db8d13e45805e30ce91caf9b807bff77d9c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:10:17 compute-0 sudo[46851]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:17 compute-0 sudo[47003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grkphxsuxdttekknqztuijdlhoyvxvwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993417.287898-124-257209834116921/AnsiballZ_stat.py'
Nov 24 14:10:17 compute-0 sudo[47003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:17 compute-0 python3.9[47005]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:10:17 compute-0 sudo[47003]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:18 compute-0 sudo[47126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vawxdouktwjsbewhvbmiqctqcsnedmgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993417.287898-124-257209834116921/AnsiballZ_copy.py'
Nov 24 14:10:18 compute-0 sudo[47126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:18 compute-0 python3.9[47128]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993417.287898-124-257209834116921/.source.conf follow=False _original_basename=registries.conf.j2 checksum=8644ebda17e39829f9c781e11f280a719cb9e13f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:10:18 compute-0 sudo[47126]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:19 compute-0 sudo[47278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oszapirfsofihjdwtvhmqhtclqznsdot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993418.6345868-140-264749791080100/AnsiballZ_ini_file.py'
Nov 24 14:10:19 compute-0 sudo[47278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:19 compute-0 python3.9[47280]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:10:19 compute-0 sudo[47278]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:19 compute-0 irqbalance[784]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 24 14:10:19 compute-0 irqbalance[784]: IRQ 26 affinity is now unmanaged
Nov 24 14:10:19 compute-0 sudo[47430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqgfrmsndqfdhazrfpdhyttgwkmtupbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993419.410394-140-222210699766059/AnsiballZ_ini_file.py'
Nov 24 14:10:19 compute-0 sudo[47430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:19 compute-0 python3.9[47432]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:10:19 compute-0 sudo[47430]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:20 compute-0 sudo[47582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyvbxrimbpmunhydfanjngwdzzetmwfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993420.0773883-140-40563061510629/AnsiballZ_ini_file.py'
Nov 24 14:10:20 compute-0 sudo[47582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:20 compute-0 python3.9[47584]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:10:20 compute-0 sudo[47582]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:21 compute-0 sudo[47734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bymxybzhewyyuyqheorefkzrnzciwrxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993420.7298388-140-15847344579318/AnsiballZ_ini_file.py'
Nov 24 14:10:21 compute-0 sudo[47734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:21 compute-0 python3.9[47736]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:10:21 compute-0 sudo[47734]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:22 compute-0 python3.9[47886]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:10:22 compute-0 sudo[48038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exfqfqdywkbbdatblczryqpybeeqogcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993422.467009-180-42054520877417/AnsiballZ_dnf.py'
Nov 24 14:10:22 compute-0 sudo[48038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:22 compute-0 python3.9[48040]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:10:24 compute-0 sudo[48038]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:24 compute-0 sudo[48191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oukkmgtoytiavogimqvhaqgfwgqlnhem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993424.3957162-188-246223035218581/AnsiballZ_dnf.py'
Nov 24 14:10:24 compute-0 sudo[48191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:24 compute-0 python3.9[48193]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:10:26 compute-0 sudo[48191]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:28 compute-0 sudo[48351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hprkvelrjgwyqjqnutosikxyspsrkfar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993428.6745358-198-52375658740484/AnsiballZ_dnf.py'
Nov 24 14:10:28 compute-0 sudo[48351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:29 compute-0 python3.9[48353]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:10:30 compute-0 sudo[48351]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:30 compute-0 sudo[48504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xchykjgydmaokgskxhzteeifayyhmwau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993430.6619976-207-2242334207348/AnsiballZ_dnf.py'
Nov 24 14:10:30 compute-0 sudo[48504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:31 compute-0 python3.9[48506]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:10:32 compute-0 sudo[48504]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:32 compute-0 sudo[48657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddqtyiwgmxdwwcqlfnrrsjtxqrlwyoun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993432.731188-218-129621075358352/AnsiballZ_dnf.py'
Nov 24 14:10:32 compute-0 sudo[48657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:33 compute-0 python3.9[48659]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:10:35 compute-0 sudo[48657]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:35 compute-0 sudo[48813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abbpwdlpdvcteehdsmuynqkbkxrnwtcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993435.20592-226-164257115221845/AnsiballZ_dnf.py'
Nov 24 14:10:35 compute-0 sudo[48813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:35 compute-0 python3.9[48815]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:10:37 compute-0 sudo[48813]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:38 compute-0 sudo[48981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfgnudnhncsbriufdcjgkwxzdirxvmaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993438.235329-235-144807752487485/AnsiballZ_dnf.py'
Nov 24 14:10:38 compute-0 sudo[48981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:38 compute-0 python3.9[48983]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:10:40 compute-0 sudo[48981]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:40 compute-0 sudo[49134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eagqesvwiskehoedujaulhcvsbhlzrdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993440.45795-244-118728033414446/AnsiballZ_dnf.py'
Nov 24 14:10:40 compute-0 sudo[49134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:40 compute-0 python3.9[49136]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:10:53 compute-0 sudo[49134]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:53 compute-0 sudo[49471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvvakntghpmqtyziqdmekarjscbuuaud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993453.617134-253-141593414855335/AnsiballZ_dnf.py'
Nov 24 14:10:53 compute-0 sudo[49471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:54 compute-0 python3.9[49473]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:10:55 compute-0 sudo[49471]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:56 compute-0 sudo[49627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvvbrjybpfbnejrkgzttuajjddvlhonh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993455.9370053-264-920835567500/AnsiballZ_file.py'
Nov 24 14:10:56 compute-0 sudo[49627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:56 compute-0 python3.9[49629]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:10:56 compute-0 sudo[49627]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:56 compute-0 sudo[49802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrhaqgveybsbtzmotlcumtlgspqwzdtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993456.58318-272-66777334190740/AnsiballZ_stat.py'
Nov 24 14:10:56 compute-0 sudo[49802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:57 compute-0 python3.9[49804]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:10:57 compute-0 sudo[49802]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:57 compute-0 sudo[49925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiwkbclgohvmetfnhqhymiqrhsahwlml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993456.58318-272-66777334190740/AnsiballZ_copy.py'
Nov 24 14:10:57 compute-0 sudo[49925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:57 compute-0 python3.9[49927]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1763993456.58318-272-66777334190740/.source.json _original_basename=.0fej7qmq follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:10:57 compute-0 sudo[49925]: pam_unix(sudo:session): session closed for user root
Nov 24 14:10:58 compute-0 sudo[50077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twwqpbpbzalcfxeewbmystcfduldcvdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993457.954932-290-161625319295026/AnsiballZ_podman_image.py'
Nov 24 14:10:58 compute-0 sudo[50077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:10:58 compute-0 python3.9[50079]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 14:10:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat594778477-lower\x2dmapped.mount: Deactivated successfully.
Nov 24 14:11:04 compute-0 podman[50092]: 2025-11-24 14:11:04.618313543 +0000 UTC m=+5.896683805 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e
Nov 24 14:11:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:04 compute-0 sudo[50077]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:05 compute-0 sudo[50388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuoamqweapbtdxtdzqytbzijuxdeqsbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993465.1984499-301-257271772522048/AnsiballZ_podman_image.py'
Nov 24 14:11:05 compute-0 sudo[50388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:05 compute-0 python3.9[50390]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 14:11:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:16 compute-0 podman[50402]: 2025-11-24 14:11:16.717921762 +0000 UTC m=+10.902944290 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:11:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:16 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:16 compute-0 sudo[50388]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:17 compute-0 sudo[50735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqdxuzrgitzfqmbvbpwzyirtwqbcomjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993477.246106-311-502815195859/AnsiballZ_podman_image.py'
Nov 24 14:11:17 compute-0 sudo[50735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:17 compute-0 python3.9[50737]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 14:11:17 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:19 compute-0 podman[50750]: 2025-11-24 14:11:19.321698037 +0000 UTC m=+1.532427449 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24
Nov 24 14:11:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:19 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:19 compute-0 sudo[50735]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:20 compute-0 sudo[50983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gicljhwbqotzhthvqlenooaiqakfgkgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993479.774207-320-258566134728514/AnsiballZ_podman_image.py'
Nov 24 14:11:20 compute-0 sudo[50983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:20 compute-0 python3.9[50985]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 14:11:33 compute-0 podman[50996]: 2025-11-24 14:11:33.845604041 +0000 UTC m=+13.561513597 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076
Nov 24 14:11:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:34 compute-0 sudo[50983]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:34 compute-0 sudo[51267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fegzorkxvoapmsborqdfujbvvnzfgaxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993494.5578482-331-167033946423643/AnsiballZ_podman_image.py'
Nov 24 14:11:34 compute-0 sudo[51267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:35 compute-0 python3.9[51269]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 14:11:35 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:40 compute-0 podman[51282]: 2025-11-24 14:11:40.903742095 +0000 UTC m=+5.749131354 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003
Nov 24 14:11:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:41 compute-0 sudo[51267]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:41 compute-0 sudo[51536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rccxonyffshbfefouushrnoqtymlfrxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993501.3072321-331-102988686889734/AnsiballZ_podman_image.py'
Nov 24 14:11:41 compute-0 sudo[51536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:41 compute-0 python3.9[51538]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 24 14:11:43 compute-0 podman[51551]: 2025-11-24 14:11:43.212718879 +0000 UTC m=+1.328232376 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 24 14:11:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:11:43 compute-0 sudo[51536]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:43 compute-0 sshd-session[45239]: Connection closed by 192.168.122.30 port 36476
Nov 24 14:11:43 compute-0 sshd-session[45236]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:11:43 compute-0 systemd-logind[807]: Session 11 logged out. Waiting for processes to exit.
Nov 24 14:11:43 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Nov 24 14:11:43 compute-0 systemd[1]: session-11.scope: Consumed 1min 50.492s CPU time.
Nov 24 14:11:43 compute-0 systemd-logind[807]: Removed session 11.
Nov 24 14:11:49 compute-0 sshd-session[51701]: Accepted publickey for zuul from 192.168.122.30 port 55658 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:11:49 compute-0 systemd-logind[807]: New session 12 of user zuul.
Nov 24 14:11:49 compute-0 systemd[1]: Started Session 12 of User zuul.
Nov 24 14:11:49 compute-0 sshd-session[51701]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:11:50 compute-0 python3.9[51854]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:11:51 compute-0 sudo[52008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gprltukkylvlzwpnrdvupiqqreavpfsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993511.2960353-36-84063866076654/AnsiballZ_getent.py'
Nov 24 14:11:51 compute-0 sudo[52008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:51 compute-0 python3.9[52010]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 24 14:11:52 compute-0 sudo[52008]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:52 compute-0 sudo[52161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjojhmzyblnnqqsvfnlittfqskcdpurm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993512.2118745-44-195142338997099/AnsiballZ_group.py'
Nov 24 14:11:52 compute-0 sudo[52161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:52 compute-0 python3.9[52163]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 14:11:52 compute-0 groupadd[52164]: group added to /etc/group: name=openvswitch, GID=42476
Nov 24 14:11:52 compute-0 groupadd[52164]: group added to /etc/gshadow: name=openvswitch
Nov 24 14:11:52 compute-0 groupadd[52164]: new group: name=openvswitch, GID=42476
Nov 24 14:11:52 compute-0 sudo[52161]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:53 compute-0 sudo[52319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dymkcdrjzlwoksvmbhvoookaqknhhccx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993513.1060362-52-191258854218305/AnsiballZ_user.py'
Nov 24 14:11:53 compute-0 sudo[52319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:53 compute-0 python3.9[52321]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 14:11:53 compute-0 useradd[52323]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 14:11:53 compute-0 useradd[52323]: add 'openvswitch' to group 'hugetlbfs'
Nov 24 14:11:53 compute-0 useradd[52323]: add 'openvswitch' to shadow group 'hugetlbfs'
Nov 24 14:11:53 compute-0 sudo[52319]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:54 compute-0 sudo[52479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lppywkcuoowpgskjryeaugzhdfwpzegk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993514.1292872-62-99465288571295/AnsiballZ_setup.py'
Nov 24 14:11:54 compute-0 sudo[52479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:54 compute-0 python3.9[52481]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:11:54 compute-0 sudo[52479]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:55 compute-0 sudo[52563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgdnsjujdrrwtwwvrjhxocrwirmimmyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993514.1292872-62-99465288571295/AnsiballZ_dnf.py'
Nov 24 14:11:55 compute-0 sudo[52563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:55 compute-0 python3.9[52565]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:11:57 compute-0 sudo[52563]: pam_unix(sudo:session): session closed for user root
Nov 24 14:11:57 compute-0 sudo[52725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cojrtqoinqticiwjccrxyblytavmkymk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993517.3225377-76-226500487131396/AnsiballZ_dnf.py'
Nov 24 14:11:57 compute-0 sudo[52725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:11:57 compute-0 python3.9[52727]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:12:11 compute-0 kernel: SELinux:  Converting 2731 SID table entries...
Nov 24 14:12:11 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 14:12:11 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 14:12:11 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 14:12:11 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 14:12:11 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 14:12:11 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 14:12:11 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 14:12:11 compute-0 groupadd[52751]: group added to /etc/group: name=unbound, GID=993
Nov 24 14:12:11 compute-0 groupadd[52751]: group added to /etc/gshadow: name=unbound
Nov 24 14:12:11 compute-0 groupadd[52751]: new group: name=unbound, GID=993
Nov 24 14:12:11 compute-0 useradd[52758]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Nov 24 14:12:11 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 24 14:12:11 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 24 14:12:12 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 14:12:12 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 14:12:12 compute-0 systemd[1]: Reloading.
Nov 24 14:12:13 compute-0 systemd-sysv-generator[53260]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:12:13 compute-0 systemd-rc-local-generator[53256]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:12:13 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 14:12:13 compute-0 sudo[52725]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:13 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 14:12:13 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 14:12:13 compute-0 systemd[1]: run-r4b1bb84c951d443388bbe093cd71a871.service: Deactivated successfully.
Nov 24 14:12:14 compute-0 sudo[53824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwxhdzegqmqzftjurnfbmifwmiqpacwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993533.8329632-84-79200602936921/AnsiballZ_systemd.py'
Nov 24 14:12:14 compute-0 sudo[53824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:14 compute-0 python3.9[53826]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 14:12:14 compute-0 systemd[1]: Reloading.
Nov 24 14:12:14 compute-0 systemd-rc-local-generator[53856]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:12:14 compute-0 systemd-sysv-generator[53859]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:12:15 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Nov 24 14:12:15 compute-0 chown[53867]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 24 14:12:15 compute-0 ovs-ctl[53872]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 24 14:12:15 compute-0 ovs-ctl[53872]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 24 14:12:15 compute-0 ovs-ctl[53872]: Starting ovsdb-server [  OK  ]
Nov 24 14:12:15 compute-0 ovs-vsctl[53921]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 24 14:12:15 compute-0 ovs-vsctl[53941]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"dfd2f9fd-c9ed-4d16-a231-48176f986586\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 24 14:12:15 compute-0 ovs-ctl[53872]: Configuring Open vSwitch system IDs [  OK  ]
Nov 24 14:12:15 compute-0 ovs-vsctl[53946]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 24 14:12:15 compute-0 ovs-ctl[53872]: Enabling remote OVSDB managers [  OK  ]
Nov 24 14:12:15 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Nov 24 14:12:15 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 24 14:12:15 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 24 14:12:15 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 24 14:12:15 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Nov 24 14:12:15 compute-0 ovs-ctl[53991]: Inserting openvswitch module [  OK  ]
Nov 24 14:12:15 compute-0 ovs-ctl[53960]: Starting ovs-vswitchd [  OK  ]
Nov 24 14:12:15 compute-0 ovs-vsctl[54008]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Nov 24 14:12:15 compute-0 ovs-ctl[53960]: Enabling remote OVSDB managers [  OK  ]
Nov 24 14:12:15 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 24 14:12:15 compute-0 systemd[1]: Starting Open vSwitch...
Nov 24 14:12:15 compute-0 systemd[1]: Finished Open vSwitch.
Nov 24 14:12:15 compute-0 sudo[53824]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:16 compute-0 python3.9[54160]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:12:17 compute-0 sudo[54310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsnihnibuunllhncbhkxotpjjlepciro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993537.004212-102-194281598292194/AnsiballZ_sefcontext.py'
Nov 24 14:12:17 compute-0 sudo[54310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:17 compute-0 python3.9[54312]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 24 14:12:18 compute-0 kernel: SELinux:  Converting 2745 SID table entries...
Nov 24 14:12:18 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 14:12:18 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 14:12:18 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 14:12:18 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 14:12:18 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 14:12:18 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 14:12:18 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 14:12:19 compute-0 sudo[54310]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:19 compute-0 python3.9[54468]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:12:20 compute-0 sudo[54624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwgvvtqwyixwhxcgsgihtfsywffdwydq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993540.3593516-120-253956524738375/AnsiballZ_dnf.py'
Nov 24 14:12:20 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 24 14:12:20 compute-0 sudo[54624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:20 compute-0 python3.9[54626]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:12:22 compute-0 sudo[54624]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:22 compute-0 sudo[54777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkopinqhptisyynzrwwdycbzpzebxqwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993542.4377255-128-61106666729737/AnsiballZ_command.py'
Nov 24 14:12:22 compute-0 sudo[54777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:23 compute-0 python3.9[54779]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:12:23 compute-0 sudo[54777]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:24 compute-0 sudo[55064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulstqljadazvraggmkcpjjivtbcfftog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993544.0503287-136-268466252138606/AnsiballZ_file.py'
Nov 24 14:12:24 compute-0 sudo[55064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:24 compute-0 python3.9[55066]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 24 14:12:24 compute-0 sudo[55064]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:25 compute-0 python3.9[55216]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:12:25 compute-0 sudo[55368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-viupcbrxyvrjusllrtelznttfpssemes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993545.6739454-152-87152384896017/AnsiballZ_dnf.py'
Nov 24 14:12:25 compute-0 sudo[55368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:26 compute-0 python3.9[55370]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:12:28 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 14:12:28 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 14:12:28 compute-0 systemd[1]: Reloading.
Nov 24 14:12:28 compute-0 systemd-sysv-generator[55409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:12:28 compute-0 systemd-rc-local-generator[55405]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:12:28 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 14:12:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 14:12:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 14:12:28 compute-0 systemd[1]: run-r53b4b59e577040adba6d886f3de9ca57.service: Deactivated successfully.
Nov 24 14:12:28 compute-0 sudo[55368]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:29 compute-0 sudo[55685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqxuycmoocshiqzwmtjobpmmhanforsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993548.78399-160-126497304072494/AnsiballZ_systemd.py'
Nov 24 14:12:29 compute-0 sudo[55685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:29 compute-0 python3.9[55687]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:12:29 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 24 14:12:29 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Nov 24 14:12:29 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Nov 24 14:12:29 compute-0 systemd[1]: Stopping Network Manager...
Nov 24 14:12:29 compute-0 NetworkManager[7187]: <info>  [1763993549.3483] caught SIGTERM, shutting down normally.
Nov 24 14:12:29 compute-0 NetworkManager[7187]: <info>  [1763993549.3500] dhcp4 (eth0): canceled DHCP transaction
Nov 24 14:12:29 compute-0 NetworkManager[7187]: <info>  [1763993549.3500] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 14:12:29 compute-0 NetworkManager[7187]: <info>  [1763993549.3500] dhcp4 (eth0): state changed no lease
Nov 24 14:12:29 compute-0 NetworkManager[7187]: <info>  [1763993549.3504] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 14:12:29 compute-0 NetworkManager[7187]: <info>  [1763993549.3589] exiting (success)
Nov 24 14:12:29 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 14:12:29 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 24 14:12:29 compute-0 systemd[1]: Stopped Network Manager.
Nov 24 14:12:29 compute-0 systemd[1]: NetworkManager.service: Consumed 8.811s CPU time, 4.1M memory peak, read 0B from disk, written 18.0K to disk.
Nov 24 14:12:29 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 14:12:29 compute-0 systemd[1]: Starting Network Manager...
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.4185] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:07f12fa4-da71-4102-93d7-808aadc1fc71)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.4188] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.4239] manager[0x556958cef090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 24 14:12:29 compute-0 systemd[1]: Starting Hostname Service...
Nov 24 14:12:29 compute-0 systemd[1]: Started Hostname Service.
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5054] hostname: hostname: using hostnamed
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5055] hostname: static hostname changed from (none) to "compute-0"
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5060] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5065] manager[0x556958cef090]: rfkill: Wi-Fi hardware radio set enabled
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5065] manager[0x556958cef090]: rfkill: WWAN hardware radio set enabled
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5084] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5092] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5093] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5093] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5094] manager: Networking is enabled by state file
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5096] settings: Loaded settings plugin: keyfile (internal)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5099] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5120] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5129] dhcp: init: Using DHCP client 'internal'
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5130] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5134] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5140] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5146] device (lo): Activation: starting connection 'lo' (e622f3c8-2558-4f9f-886b-8e216d717dfd)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5151] device (eth0): carrier: link connected
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5154] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5157] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5158] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5162] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5180] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5187] device (eth1): carrier: link connected
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5191] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5195] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (7049c056-21f5-55fc-906e-9890c70fc7c7) (indicated)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5196] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5200] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5207] device (eth1): Activation: starting connection 'ci-private-network' (7049c056-21f5-55fc-906e-9890c70fc7c7)
Nov 24 14:12:29 compute-0 systemd[1]: Started Network Manager.
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5213] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5220] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5222] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5224] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5227] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5229] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5230] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5232] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5234] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5239] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5241] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5251] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5272] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5280] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5284] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5290] device (lo): Activation: successful, device activated.
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5297] dhcp4 (eth0): state changed new lease, address=38.102.83.214
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5304] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5361] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5364] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5370] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5373] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 24 14:12:29 compute-0 systemd[1]: Starting Network Manager Wait Online...
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5387] device (eth1): Activation: successful, device activated.
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5400] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5401] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5405] manager: NetworkManager state is now CONNECTED_SITE
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5409] device (eth0): Activation: successful, device activated.
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5415] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 24 14:12:29 compute-0 NetworkManager[55697]: <info>  [1763993549.5419] manager: startup complete
Nov 24 14:12:29 compute-0 sudo[55685]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:29 compute-0 systemd[1]: Finished Network Manager Wait Online.
Nov 24 14:12:30 compute-0 sudo[55911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gywaflgkwqlioubcbkjyzfclfqwbkmef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993549.7310412-168-182416422717783/AnsiballZ_dnf.py'
Nov 24 14:12:30 compute-0 sudo[55911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:30 compute-0 python3.9[55913]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:12:35 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 14:12:35 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 14:12:35 compute-0 systemd[1]: Reloading.
Nov 24 14:12:35 compute-0 systemd-rc-local-generator[55963]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:12:35 compute-0 systemd-sysv-generator[55966]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:12:35 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 14:12:36 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 14:12:36 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 14:12:36 compute-0 systemd[1]: run-ra0605a331b554259975d55a78d6b5924.service: Deactivated successfully.
Nov 24 14:12:36 compute-0 sudo[55911]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:37 compute-0 sudo[56369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-angseyfutuwxqbguvkidsmckchrqdtpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993556.9131074-180-129345885089004/AnsiballZ_stat.py'
Nov 24 14:12:37 compute-0 sudo[56369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:37 compute-0 python3.9[56371]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:12:37 compute-0 sudo[56369]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:37 compute-0 sudo[56521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmdftegwlofsrlddsjxrxcnuhxhpxgdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993557.5551133-189-267912632284056/AnsiballZ_ini_file.py'
Nov 24 14:12:37 compute-0 sudo[56521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:38 compute-0 python3.9[56523]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:38 compute-0 sudo[56521]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:38 compute-0 sudo[56675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldlnipvufyhpvhvpafpqyfblrhlxhxnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993558.4543357-199-48783113986623/AnsiballZ_ini_file.py'
Nov 24 14:12:38 compute-0 sudo[56675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:38 compute-0 python3.9[56677]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:38 compute-0 sudo[56675]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:39 compute-0 sudo[56827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlitybmlzsrwmppujcemgcgeglupijhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993559.0826197-199-199591209668820/AnsiballZ_ini_file.py'
Nov 24 14:12:39 compute-0 sudo[56827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:39 compute-0 python3.9[56829]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:39 compute-0 sudo[56827]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:39 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 14:12:39 compute-0 sudo[56980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykdjbhxoieoaeracawanpyhjkjaqnuju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993559.6822882-214-60758270478769/AnsiballZ_ini_file.py'
Nov 24 14:12:39 compute-0 sudo[56980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:40 compute-0 python3.9[56982]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:40 compute-0 sudo[56980]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:40 compute-0 sudo[57132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzubpzjpicxcgdknugllhyuwlvjktqzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993560.220211-214-7916667259320/AnsiballZ_ini_file.py'
Nov 24 14:12:40 compute-0 sudo[57132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:40 compute-0 python3.9[57134]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:40 compute-0 sudo[57132]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:41 compute-0 sudo[57284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arvbvisopyadxkidozjriqylxkxfiygh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993560.8193102-229-266948067426194/AnsiballZ_stat.py'
Nov 24 14:12:41 compute-0 sudo[57284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:41 compute-0 python3.9[57286]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:12:41 compute-0 sudo[57284]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:41 compute-0 sudo[57407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emifbatabdotgtptqzdugjwtngenzncz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993560.8193102-229-266948067426194/AnsiballZ_copy.py'
Nov 24 14:12:41 compute-0 sudo[57407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:41 compute-0 python3.9[57409]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993560.8193102-229-266948067426194/.source _original_basename=.gak35bln follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:41 compute-0 sudo[57407]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:42 compute-0 sudo[57559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yabfkpclaeixzdvbcevljxzpwbdmzvhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993562.0864277-244-31598725276258/AnsiballZ_file.py'
Nov 24 14:12:42 compute-0 sudo[57559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:42 compute-0 python3.9[57561]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:42 compute-0 sudo[57559]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:43 compute-0 sudo[57711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szxgxsavkvrqrznpfonjlavlfnhpttkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993562.71077-252-2073745782557/AnsiballZ_edpm_os_net_config_mappings.py'
Nov 24 14:12:43 compute-0 sudo[57711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:43 compute-0 python3.9[57713]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 24 14:12:43 compute-0 sudo[57711]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:43 compute-0 sudo[57863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mysbogkevsmohbqvmlgzvhnrmanqbkah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993563.627608-261-185292159739043/AnsiballZ_file.py'
Nov 24 14:12:43 compute-0 sudo[57863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:44 compute-0 python3.9[57865]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:44 compute-0 sudo[57863]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:44 compute-0 sudo[58015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahrfxnvtemtoklazrqwimyegjvwyyrdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993564.4308884-271-106316705447533/AnsiballZ_stat.py'
Nov 24 14:12:44 compute-0 sudo[58015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:44 compute-0 sudo[58015]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:45 compute-0 sudo[58138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxbjfvnpmmxthkfgjddgdnlniyhzvtny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993564.4308884-271-106316705447533/AnsiballZ_copy.py'
Nov 24 14:12:45 compute-0 sudo[58138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:45 compute-0 sudo[58138]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:45 compute-0 sudo[58290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsubgoueucyenzhmthidchbvqjlkvwcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993565.513635-286-205571117869026/AnsiballZ_slurp.py'
Nov 24 14:12:45 compute-0 sudo[58290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:46 compute-0 python3.9[58292]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 24 14:12:46 compute-0 sudo[58290]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:47 compute-0 sudo[58465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmfhyuquzfebimllvhszjsgjhedvggfn ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993566.4517694-295-94365154441459/async_wrapper.py j24563145645 300 /home/zuul/.ansible/tmp/ansible-tmp-1763993566.4517694-295-94365154441459/AnsiballZ_edpm_os_net_config.py _'
Nov 24 14:12:47 compute-0 sudo[58465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:47 compute-0 ansible-async_wrapper.py[58467]: Invoked with j24563145645 300 /home/zuul/.ansible/tmp/ansible-tmp-1763993566.4517694-295-94365154441459/AnsiballZ_edpm_os_net_config.py _
Nov 24 14:12:47 compute-0 ansible-async_wrapper.py[58470]: Starting module and watcher
Nov 24 14:12:47 compute-0 ansible-async_wrapper.py[58470]: Start watching 58471 (300)
Nov 24 14:12:47 compute-0 ansible-async_wrapper.py[58471]: Start module (58471)
Nov 24 14:12:47 compute-0 ansible-async_wrapper.py[58467]: Return async_wrapper task started.
Nov 24 14:12:47 compute-0 sudo[58465]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:47 compute-0 python3.9[58472]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 24 14:12:48 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 24 14:12:48 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 24 14:12:48 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 24 14:12:48 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 24 14:12:48 compute-0 kernel: cfg80211: failed to load regulatory.db
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1370] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1383] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1863] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1864] audit: op="connection-add" uuid="71e14b9d-d2e9-4290-adca-bf2540856e0a" name="br-ex-br" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1879] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1880] audit: op="connection-add" uuid="7024ae72-caf0-4d68-8395-0af1e561ffb0" name="br-ex-port" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1891] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1892] audit: op="connection-add" uuid="862795d1-29ea-47b4-acdb-5ef5960dfcf5" name="eth1-port" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1902] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1903] audit: op="connection-add" uuid="e8a8f147-70c7-4754-bb3e-3865bc1ec747" name="vlan20-port" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1913] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1914] audit: op="connection-add" uuid="4bc6041b-97d0-460b-82f5-faa3ff37be16" name="vlan21-port" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1924] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1925] audit: op="connection-add" uuid="2971fba7-c9d9-44da-9eca-31a2c85797ad" name="vlan22-port" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1943] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.addr-gen-mode,ipv6.method,ipv6.dhcp-timeout" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1957] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.1959] audit: op="connection-add" uuid="57be861f-4893-41bf-981d-342a6f993f0c" name="br-ex-if" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2046] audit: op="connection-update" uuid="7049c056-21f5-55fc-906e-9890c70fc7c7" name="ci-private-network" args="connection.controller,connection.slave-type,connection.port-type,connection.timestamp,connection.master,ipv4.addresses,ipv4.method,ipv4.dns,ipv4.routes,ipv4.never-default,ipv4.routing-rules,ovs-external-ids.data,ipv6.addr-gen-mode,ipv6.addresses,ipv6.dns,ipv6.routes,ipv6.method,ipv6.routing-rules,ovs-interface.type" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2061] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2063] audit: op="connection-add" uuid="e926046b-4bb7-4a2b-99be-c37f0dabb0fa" name="vlan20-if" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2078] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2079] audit: op="connection-add" uuid="b4337e7e-ca23-492c-ab48-f055932ee3db" name="vlan21-if" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2093] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2095] audit: op="connection-add" uuid="53fd2b61-fbc0-48f4-80ae-6971050da4ef" name="vlan22-if" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2107] audit: op="connection-delete" uuid="c0f7c555-02e9-39b7-af48-41e7866f30b4" name="Wired connection 1" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2118] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2127] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2130] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (71e14b9d-d2e9-4290-adca-bf2540856e0a)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2131] audit: op="connection-activate" uuid="71e14b9d-d2e9-4290-adca-bf2540856e0a" name="br-ex-br" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2132] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2137] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2140] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (7024ae72-caf0-4d68-8395-0af1e561ffb0)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2142] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2146] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2149] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (862795d1-29ea-47b4-acdb-5ef5960dfcf5)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2151] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2156] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2159] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (e8a8f147-70c7-4754-bb3e-3865bc1ec747)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2161] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2165] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2168] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (4bc6041b-97d0-460b-82f5-faa3ff37be16)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2170] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2175] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2178] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (2971fba7-c9d9-44da-9eca-31a2c85797ad)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2178] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2180] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2182] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2187] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2190] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2194] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (57be861f-4893-41bf-981d-342a6f993f0c)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2195] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2198] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2199] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2200] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2201] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2209] device (eth1): disconnecting for new activation request.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2210] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2212] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2213] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2214] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2217] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2221] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2224] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (e926046b-4bb7-4a2b-99be-c37f0dabb0fa)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2225] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2227] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2229] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2230] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2232] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2236] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2239] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (b4337e7e-ca23-492c-ab48-f055932ee3db)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2240] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2242] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2243] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2244] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2246] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2250] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2252] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (53fd2b61-fbc0-48f4-80ae-6971050da4ef)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2253] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2255] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2257] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2258] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2259] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2269] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.addr-gen-mode,ipv6.method" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2270] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2272] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2274] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2279] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2283] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2286] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2289] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2290] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2294] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 kernel: ovs-system: entered promiscuous mode
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2312] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2319] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2322] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 kernel: Timeout policy base is empty
Nov 24 14:12:49 compute-0 systemd-udevd[58479]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2330] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2336] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2340] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2342] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2350] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2356] dhcp4 (eth0): canceled DHCP transaction
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2356] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2357] dhcp4 (eth0): state changed no lease
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2359] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2375] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2383] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58473 uid=0 result="fail" reason="Device is not activated"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2388] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2394] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 24 14:12:49 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 24 14:12:49 compute-0 kernel: br-ex: entered promiscuous mode
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2665] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 24 14:12:49 compute-0 kernel: vlan22: entered promiscuous mode
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2687] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 24 14:12:49 compute-0 systemd-udevd[58477]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:12:49 compute-0 kernel: vlan21: entered promiscuous mode
Nov 24 14:12:49 compute-0 kernel: vlan20: entered promiscuous mode
Nov 24 14:12:49 compute-0 systemd-udevd[58478]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2814] device (eth1): Activation: starting connection 'ci-private-network' (7049c056-21f5-55fc-906e-9890c70fc7c7)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2818] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2819] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2820] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2821] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2822] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2823] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2824] device (eth1): state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2831] device (eth1): disconnecting for new activation request.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2832] audit: op="connection-activate" uuid="7049c056-21f5-55fc-906e-9890c70fc7c7" name="ci-private-network" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2844] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2849] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2853] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2861] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2864] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2875] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2878] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2883] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2888] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2892] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2896] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2900] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2904] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2911] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2938] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2939] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58473 uid=0 result="success"
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2939] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2949] device (eth1): Activation: starting connection 'ci-private-network' (7049c056-21f5-55fc-906e-9890c70fc7c7)
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2955] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2963] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2967] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2988] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.2997] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3002] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3016] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3019] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3026] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3034] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3039] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3043] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3048] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3053] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3054] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3058] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3062] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3067] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3074] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3080] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3082] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3087] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3092] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3095] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.3100] device (eth1): Activation: successful, device activated.
Nov 24 14:12:49 compute-0 NetworkManager[55697]: <info>  [1763993569.4426] dhcp4 (eth0): state changed new lease, address=38.102.83.214
Nov 24 14:12:50 compute-0 NetworkManager[55697]: <info>  [1763993570.4378] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58473 uid=0 result="success"
Nov 24 14:12:50 compute-0 NetworkManager[55697]: <info>  [1763993570.5924] checkpoint[0x556958cc5950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 24 14:12:50 compute-0 NetworkManager[55697]: <info>  [1763993570.5926] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58473 uid=0 result="success"
Nov 24 14:12:50 compute-0 NetworkManager[55697]: <info>  [1763993570.8424] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58473 uid=0 result="success"
Nov 24 14:12:50 compute-0 NetworkManager[55697]: <info>  [1763993570.8434] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58473 uid=0 result="success"
Nov 24 14:12:50 compute-0 sudo[58811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weymqlnvgfptloroqaxllwkufdhminkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993570.5038116-295-223308898379541/AnsiballZ_async_status.py'
Nov 24 14:12:50 compute-0 sudo[58811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:51 compute-0 NetworkManager[55697]: <info>  [1763993571.1116] audit: op="networking-control" arg="global-dns-configuration" pid=58473 uid=0 result="success"
Nov 24 14:12:51 compute-0 NetworkManager[55697]: <info>  [1763993571.1156] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 24 14:12:51 compute-0 NetworkManager[55697]: <info>  [1763993571.1194] audit: op="networking-control" arg="global-dns-configuration" pid=58473 uid=0 result="success"
Nov 24 14:12:51 compute-0 NetworkManager[55697]: <info>  [1763993571.1238] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58473 uid=0 result="success"
Nov 24 14:12:51 compute-0 python3.9[58813]: ansible-ansible.legacy.async_status Invoked with jid=j24563145645.58467 mode=status _async_dir=/root/.ansible_async
Nov 24 14:12:51 compute-0 sudo[58811]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:51 compute-0 NetworkManager[55697]: <info>  [1763993571.3002] checkpoint[0x556958cc5a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 24 14:12:51 compute-0 NetworkManager[55697]: <info>  [1763993571.3007] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58473 uid=0 result="success"
Nov 24 14:12:51 compute-0 ansible-async_wrapper.py[58471]: Module complete (58471)
Nov 24 14:12:52 compute-0 ansible-async_wrapper.py[58470]: Done in kid B.
Nov 24 14:12:54 compute-0 sudo[58915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eylxgujahxbydspgzugmlliptdwpusvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993570.5038116-295-223308898379541/AnsiballZ_async_status.py'
Nov 24 14:12:54 compute-0 sudo[58915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:54 compute-0 python3.9[58917]: ansible-ansible.legacy.async_status Invoked with jid=j24563145645.58467 mode=status _async_dir=/root/.ansible_async
Nov 24 14:12:54 compute-0 sudo[58915]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:54 compute-0 sudo[59015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urjihywmngztxwhrdwxpnlhqrgsxgbey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993570.5038116-295-223308898379541/AnsiballZ_async_status.py'
Nov 24 14:12:54 compute-0 sudo[59015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:55 compute-0 python3.9[59017]: ansible-ansible.legacy.async_status Invoked with jid=j24563145645.58467 mode=cleanup _async_dir=/root/.ansible_async
Nov 24 14:12:55 compute-0 sudo[59015]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:55 compute-0 sudo[59167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbitscrzcqxtqjdmxymrcjfbzpfhlwka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993575.2420664-322-117732294502043/AnsiballZ_stat.py'
Nov 24 14:12:55 compute-0 sudo[59167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:55 compute-0 python3.9[59169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:12:55 compute-0 sudo[59167]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:55 compute-0 sudo[59290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiduqhszlyhytgvscmglhxuldwxyfzls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993575.2420664-322-117732294502043/AnsiballZ_copy.py'
Nov 24 14:12:55 compute-0 sudo[59290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:56 compute-0 python3.9[59292]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993575.2420664-322-117732294502043/.source.returncode _original_basename=.8hg9l1o6 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:56 compute-0 sudo[59290]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:56 compute-0 sudo[59442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hggeyoryrkglpzggyybzwdydiuqktlnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993576.4495056-338-114407399406229/AnsiballZ_stat.py'
Nov 24 14:12:56 compute-0 sudo[59442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:56 compute-0 python3.9[59444]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:12:56 compute-0 sudo[59442]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:57 compute-0 sudo[59565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xujpdrevkamvguisbrwhaffiiismmzbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993576.4495056-338-114407399406229/AnsiballZ_copy.py'
Nov 24 14:12:57 compute-0 sudo[59565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:57 compute-0 python3.9[59567]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993576.4495056-338-114407399406229/.source.cfg _original_basename=.k0pln_ga follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:12:57 compute-0 sudo[59565]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:57 compute-0 sudo[59718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axwvrhsbtednhgjkbiozywjfnvstqwwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993577.5556912-353-109813356904492/AnsiballZ_systemd.py'
Nov 24 14:12:57 compute-0 sudo[59718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:12:58 compute-0 python3.9[59720]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:12:58 compute-0 systemd[1]: Reloading Network Manager...
Nov 24 14:12:58 compute-0 NetworkManager[55697]: <info>  [1763993578.2186] audit: op="reload" arg="0" pid=59724 uid=0 result="success"
Nov 24 14:12:58 compute-0 NetworkManager[55697]: <info>  [1763993578.2196] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 24 14:12:58 compute-0 systemd[1]: Reloaded Network Manager.
Nov 24 14:12:58 compute-0 sudo[59718]: pam_unix(sudo:session): session closed for user root
Nov 24 14:12:58 compute-0 sshd-session[51704]: Connection closed by 192.168.122.30 port 55658
Nov 24 14:12:58 compute-0 sshd-session[51701]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:12:58 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Nov 24 14:12:58 compute-0 systemd[1]: session-12.scope: Consumed 48.254s CPU time.
Nov 24 14:12:58 compute-0 systemd-logind[807]: Session 12 logged out. Waiting for processes to exit.
Nov 24 14:12:58 compute-0 systemd-logind[807]: Removed session 12.
Nov 24 14:12:59 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 14:13:04 compute-0 sshd-session[59757]: Accepted publickey for zuul from 192.168.122.30 port 49460 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:13:04 compute-0 systemd-logind[807]: New session 13 of user zuul.
Nov 24 14:13:04 compute-0 systemd[1]: Started Session 13 of User zuul.
Nov 24 14:13:04 compute-0 sshd-session[59757]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:13:05 compute-0 python3.9[59910]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:13:06 compute-0 python3.9[60064]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:13:07 compute-0 python3.9[60254]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:13:08 compute-0 sshd-session[59760]: Connection closed by 192.168.122.30 port 49460
Nov 24 14:13:08 compute-0 sshd-session[59757]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:13:08 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Nov 24 14:13:08 compute-0 systemd[1]: session-13.scope: Consumed 2.339s CPU time.
Nov 24 14:13:08 compute-0 systemd-logind[807]: Session 13 logged out. Waiting for processes to exit.
Nov 24 14:13:08 compute-0 systemd-logind[807]: Removed session 13.
Nov 24 14:13:08 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 24 14:13:13 compute-0 sshd-session[60282]: Accepted publickey for zuul from 192.168.122.30 port 35530 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:13:13 compute-0 systemd-logind[807]: New session 14 of user zuul.
Nov 24 14:13:13 compute-0 systemd[1]: Started Session 14 of User zuul.
Nov 24 14:13:13 compute-0 sshd-session[60282]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:13:14 compute-0 python3.9[60436]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:13:15 compute-0 python3.9[60590]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:13:16 compute-0 sudo[60744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqzureuibskyshuserwowhpmdweldvzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993595.7899394-40-101017502305940/AnsiballZ_setup.py'
Nov 24 14:13:16 compute-0 sudo[60744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:16 compute-0 python3.9[60746]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:13:16 compute-0 sudo[60744]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:17 compute-0 sudo[60829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmivsgywoggonrusqibbekqpdsjsazzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993595.7899394-40-101017502305940/AnsiballZ_dnf.py'
Nov 24 14:13:17 compute-0 sudo[60829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:17 compute-0 python3.9[60831]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:13:18 compute-0 sudo[60829]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:19 compute-0 sudo[60982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmnmaloqbstqvwwecqkjzqrhwdsulwtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993598.767303-52-5773644248316/AnsiballZ_setup.py'
Nov 24 14:13:19 compute-0 sudo[60982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:19 compute-0 python3.9[60984]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:13:19 compute-0 sudo[60982]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:20 compute-0 sudo[61174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbseioaxrcsvgrsyhizfggngyesipbau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993599.8169081-63-237227101743956/AnsiballZ_file.py'
Nov 24 14:13:20 compute-0 sudo[61174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:20 compute-0 python3.9[61176]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:13:20 compute-0 sudo[61174]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:20 compute-0 sudo[61326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlhiojrqabvqogroouedbfwihdwzwsqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993600.5660071-71-138285649932715/AnsiballZ_command.py'
Nov 24 14:13:20 compute-0 sudo[61326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:21 compute-0 python3.9[61328]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:13:21 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:13:21 compute-0 sudo[61326]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:21 compute-0 sudo[61489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxguwrsscsyvipttzygtkrkbvtlbswir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993601.3620148-79-211624292396087/AnsiballZ_stat.py'
Nov 24 14:13:21 compute-0 sudo[61489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:21 compute-0 python3.9[61491]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:13:22 compute-0 sudo[61489]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:22 compute-0 sudo[61567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grjprvxkvwgkeqvbmgurqncxcjqdkkxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993601.3620148-79-211624292396087/AnsiballZ_file.py'
Nov 24 14:13:22 compute-0 sudo[61567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:22 compute-0 python3.9[61569]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:13:22 compute-0 sudo[61567]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:22 compute-0 sudo[61719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltnoqcfttrxgjcgpxirmmczajldrvqeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993602.585148-91-177740112599489/AnsiballZ_stat.py'
Nov 24 14:13:22 compute-0 sudo[61719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:23 compute-0 python3.9[61721]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:13:23 compute-0 sudo[61719]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:23 compute-0 sudo[61797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uozywpdefdopfvjzuvsslrmaxejivuzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993602.585148-91-177740112599489/AnsiballZ_file.py'
Nov 24 14:13:23 compute-0 sudo[61797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:23 compute-0 python3.9[61799]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:13:23 compute-0 sudo[61797]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:24 compute-0 sudo[61949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwdayrmugujclblczlljjlzzfciawcno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993603.7485962-104-263610670754817/AnsiballZ_ini_file.py'
Nov 24 14:13:24 compute-0 sudo[61949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:24 compute-0 python3.9[61951]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:13:24 compute-0 sudo[61949]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:24 compute-0 sudo[62101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zexhlropyrddvspnxvmtmubriwmmnilw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993604.4257753-104-260217541822483/AnsiballZ_ini_file.py'
Nov 24 14:13:24 compute-0 sudo[62101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:24 compute-0 python3.9[62103]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:13:24 compute-0 sudo[62101]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:25 compute-0 sudo[62253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvslpywtqdhqsegciryfofwgotvjoiiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993604.9862194-104-222016207165618/AnsiballZ_ini_file.py'
Nov 24 14:13:25 compute-0 sudo[62253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:25 compute-0 python3.9[62255]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:13:25 compute-0 sudo[62253]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:25 compute-0 sudo[62405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjfghwrezdrlyfazhhtdygjwboqgosob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993605.5391755-104-26121157437179/AnsiballZ_ini_file.py'
Nov 24 14:13:25 compute-0 sudo[62405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:25 compute-0 python3.9[62407]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:13:25 compute-0 sudo[62405]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:26 compute-0 sudo[62557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvylkqfhsmwtmkvohqvcmfmjzaeigmcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993606.171691-135-30043896174777/AnsiballZ_dnf.py'
Nov 24 14:13:26 compute-0 sudo[62557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:26 compute-0 python3.9[62559]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:13:28 compute-0 sudo[62557]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:28 compute-0 sudo[62710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yniekuxvohfhiykahcaxywjqfibsqpvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993608.4052336-146-59041703583588/AnsiballZ_setup.py'
Nov 24 14:13:28 compute-0 sudo[62710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:28 compute-0 python3.9[62712]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:13:29 compute-0 sudo[62710]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:29 compute-0 sudo[62864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jckitsanicsvnrebzgnqwykcmdhtzlaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993609.20194-154-200814760813780/AnsiballZ_stat.py'
Nov 24 14:13:29 compute-0 sudo[62864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:29 compute-0 python3.9[62866]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:13:29 compute-0 sudo[62864]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:30 compute-0 sudo[63016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfwrmlfdwwhfltvcmczfoqmhznravjjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993609.8336234-163-262535202636801/AnsiballZ_stat.py'
Nov 24 14:13:30 compute-0 sudo[63016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:30 compute-0 python3.9[63018]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:13:30 compute-0 sudo[63016]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:30 compute-0 sudo[63168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foslksejtxsvfovckyglzqvvlaovapvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993610.5050406-173-225931015734375/AnsiballZ_command.py'
Nov 24 14:13:30 compute-0 sudo[63168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:30 compute-0 python3.9[63170]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:13:30 compute-0 sudo[63168]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:31 compute-0 sudo[63321]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcrepsshhzivyczbrqiwqnjmxaukqkrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993611.183599-183-59855815439456/AnsiballZ_service_facts.py'
Nov 24 14:13:31 compute-0 sudo[63321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:31 compute-0 python3.9[63323]: ansible-service_facts Invoked
Nov 24 14:13:31 compute-0 network[63340]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 14:13:31 compute-0 network[63341]: 'network-scripts' will be removed from distribution in near future.
Nov 24 14:13:31 compute-0 network[63342]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 14:13:34 compute-0 sudo[63321]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:35 compute-0 sudo[63625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpofgndwludhgtcrsrtqzhdsiyspooyu ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1763993614.9417427-198-138527198860374/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1763993614.9417427-198-138527198860374/args'
Nov 24 14:13:35 compute-0 sudo[63625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:35 compute-0 sudo[63625]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:35 compute-0 sudo[63792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcjvmlizoboafzfudnmybczeisloiqqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993615.7353325-209-64571360618886/AnsiballZ_dnf.py'
Nov 24 14:13:36 compute-0 sudo[63792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:36 compute-0 python3.9[63794]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:13:37 compute-0 sudo[63792]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:38 compute-0 sudo[63945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ridsbhlflemckuwsxvzcukehylzuecqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993617.803502-222-3536162733114/AnsiballZ_package_facts.py'
Nov 24 14:13:38 compute-0 sudo[63945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:38 compute-0 python3.9[63947]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 24 14:13:38 compute-0 sudo[63945]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:39 compute-0 sudo[64097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijfsypmodsrebcouqxjosysswqlpmqvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993619.2535605-232-194492526320199/AnsiballZ_stat.py'
Nov 24 14:13:39 compute-0 sudo[64097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:39 compute-0 python3.9[64099]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:13:39 compute-0 sudo[64097]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:40 compute-0 sudo[64222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwllyfhqeuclhjdjsxcsjzamswsfxlix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993619.2535605-232-194492526320199/AnsiballZ_copy.py'
Nov 24 14:13:40 compute-0 sudo[64222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:40 compute-0 python3.9[64224]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993619.2535605-232-194492526320199/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:13:40 compute-0 sudo[64222]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:41 compute-0 sudo[64376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjxlqityngfotvtxhihglmnbqjpzlxcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993620.7648776-247-123381963359431/AnsiballZ_stat.py'
Nov 24 14:13:41 compute-0 sudo[64376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:41 compute-0 python3.9[64378]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:13:42 compute-0 sudo[64376]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:42 compute-0 sudo[64501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piramoofyufufsfmjduqtgkeelebyxsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993620.7648776-247-123381963359431/AnsiballZ_copy.py'
Nov 24 14:13:42 compute-0 sudo[64501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:42 compute-0 python3.9[64503]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993620.7648776-247-123381963359431/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:13:42 compute-0 sudo[64501]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:43 compute-0 sudo[64655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkjcufimgtpkmdppfflnguknkkzgkhsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993623.2478104-268-45437768819159/AnsiballZ_lineinfile.py'
Nov 24 14:13:43 compute-0 sudo[64655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:43 compute-0 python3.9[64657]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:13:43 compute-0 sudo[64655]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:44 compute-0 sudo[64809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfqkjtcquuxbkcbaroqgpkbxjawxetmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993624.4698277-283-216083253122114/AnsiballZ_setup.py'
Nov 24 14:13:44 compute-0 sudo[64809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:45 compute-0 python3.9[64811]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:13:45 compute-0 sudo[64809]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:45 compute-0 sudo[64893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdtslpjwoxheysiuvhpzubgmqpcwueaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993624.4698277-283-216083253122114/AnsiballZ_systemd.py'
Nov 24 14:13:45 compute-0 sudo[64893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:46 compute-0 python3.9[64895]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:13:46 compute-0 sudo[64893]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:46 compute-0 sudo[65047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aseddskapdzzqyxtxiwechaluysmgthl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993626.665123-299-142990355506957/AnsiballZ_setup.py'
Nov 24 14:13:46 compute-0 sudo[65047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:47 compute-0 python3.9[65049]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:13:47 compute-0 sudo[65047]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:47 compute-0 sudo[65131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmkgzcwkeuuhxdyspftxhzwvuxljeaka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993626.665123-299-142990355506957/AnsiballZ_systemd.py'
Nov 24 14:13:47 compute-0 sudo[65131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:48 compute-0 python3.9[65133]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:13:48 compute-0 chronyd[792]: chronyd exiting
Nov 24 14:13:48 compute-0 systemd[1]: Stopping NTP client/server...
Nov 24 14:13:48 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Nov 24 14:13:48 compute-0 systemd[1]: Stopped NTP client/server.
Nov 24 14:13:48 compute-0 systemd[1]: Starting NTP client/server...
Nov 24 14:13:48 compute-0 chronyd[65141]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 24 14:13:48 compute-0 chronyd[65141]: Frequency -26.776 +/- 0.076 ppm read from /var/lib/chrony/drift
Nov 24 14:13:48 compute-0 chronyd[65141]: Loaded seccomp filter (level 2)
Nov 24 14:13:48 compute-0 systemd[1]: Started NTP client/server.
Nov 24 14:13:48 compute-0 sudo[65131]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:48 compute-0 sshd-session[60285]: Connection closed by 192.168.122.30 port 35530
Nov 24 14:13:48 compute-0 sshd-session[60282]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:13:48 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Nov 24 14:13:48 compute-0 systemd[1]: session-14.scope: Consumed 24.202s CPU time.
Nov 24 14:13:48 compute-0 systemd-logind[807]: Session 14 logged out. Waiting for processes to exit.
Nov 24 14:13:48 compute-0 systemd-logind[807]: Removed session 14.
Nov 24 14:13:54 compute-0 sshd-session[65167]: Accepted publickey for zuul from 192.168.122.30 port 48166 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:13:54 compute-0 systemd-logind[807]: New session 15 of user zuul.
Nov 24 14:13:54 compute-0 systemd[1]: Started Session 15 of User zuul.
Nov 24 14:13:54 compute-0 sshd-session[65167]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:13:55 compute-0 python3.9[65320]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:13:56 compute-0 sudo[65474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywofzgrnpzghkzjkchjjllzqnsqlfzig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993635.8433197-33-72590193590199/AnsiballZ_file.py'
Nov 24 14:13:56 compute-0 sudo[65474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:56 compute-0 python3.9[65476]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:13:56 compute-0 sudo[65474]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:56 compute-0 sudo[65649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wykdsaetfhzldqcpjzoctgssejmmzcqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993636.5697157-41-81520888597667/AnsiballZ_stat.py'
Nov 24 14:13:56 compute-0 sudo[65649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:57 compute-0 python3.9[65651]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:13:57 compute-0 sudo[65649]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:57 compute-0 sudo[65727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zefzfmatgjdwlyjciawvmwwgsdeawexp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993636.5697157-41-81520888597667/AnsiballZ_file.py'
Nov 24 14:13:57 compute-0 sudo[65727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:57 compute-0 python3.9[65729]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.a0fyu8pg recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:13:57 compute-0 sudo[65727]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:58 compute-0 sudo[65879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcfenwuvkybvmrssonjxtnbyvqyzxrrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993637.9030328-61-194290600528816/AnsiballZ_stat.py'
Nov 24 14:13:58 compute-0 sudo[65879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:58 compute-0 python3.9[65881]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:13:58 compute-0 sudo[65879]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:58 compute-0 sudo[66002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldrjkafyxwhraahpcwpfijiehduwpaib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993637.9030328-61-194290600528816/AnsiballZ_copy.py'
Nov 24 14:13:58 compute-0 sudo[66002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:59 compute-0 python3.9[66004]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993637.9030328-61-194290600528816/.source _original_basename=.8gymv94z follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:13:59 compute-0 sudo[66002]: pam_unix(sudo:session): session closed for user root
Nov 24 14:13:59 compute-0 sudo[66154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ioaobkkxvzmhcuzhtmgajoyovcmpzqcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993639.198328-77-117541061834192/AnsiballZ_file.py'
Nov 24 14:13:59 compute-0 sudo[66154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:13:59 compute-0 python3.9[66156]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:13:59 compute-0 sudo[66154]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:00 compute-0 sudo[66306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmuqvhkzvltxmxepgjplnnguavzmcedr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993639.781845-85-264334884320892/AnsiballZ_stat.py'
Nov 24 14:14:00 compute-0 sudo[66306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:00 compute-0 python3.9[66308]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:00 compute-0 sudo[66306]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:00 compute-0 sudo[66430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irocvjtreuxzpxhhlzykuqgfvpkrpxec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993639.781845-85-264334884320892/AnsiballZ_copy.py'
Nov 24 14:14:00 compute-0 sudo[66430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:00 compute-0 python3.9[66432]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993639.781845-85-264334884320892/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:14:00 compute-0 sudo[66430]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:01 compute-0 sudo[66582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rejaxclhcpludhxdoyocdrfzgofzailr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993640.935051-85-106460593031889/AnsiballZ_stat.py'
Nov 24 14:14:01 compute-0 sudo[66582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:01 compute-0 python3.9[66584]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:01 compute-0 sudo[66582]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:01 compute-0 sudo[66706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stjcbzsvjtxpbjbllrlsiiefpfkjpelb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993640.935051-85-106460593031889/AnsiballZ_copy.py'
Nov 24 14:14:01 compute-0 sudo[66706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:01 compute-0 python3.9[66708]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993640.935051-85-106460593031889/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:14:01 compute-0 sudo[66706]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:02 compute-0 sshd-session[66332]: Invalid user ubnt from 185.156.73.233 port 23992
Nov 24 14:14:02 compute-0 sshd-session[66332]: Connection closed by invalid user ubnt 185.156.73.233 port 23992 [preauth]
Nov 24 14:14:02 compute-0 sudo[66858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzqctjuthnlrealqapdmawjufxrarawf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993642.1334584-114-113808238733866/AnsiballZ_file.py'
Nov 24 14:14:02 compute-0 sudo[66858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:02 compute-0 python3.9[66860]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:02 compute-0 sudo[66858]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:03 compute-0 sudo[67010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlujmzjbleiuavhsuiuchiobqsiyjuem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993642.7536464-122-168200689687571/AnsiballZ_stat.py'
Nov 24 14:14:03 compute-0 sudo[67010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:03 compute-0 python3.9[67012]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:03 compute-0 sudo[67010]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:03 compute-0 sudo[67133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bupcblqplokhvdozlypdngdvkshaacdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993642.7536464-122-168200689687571/AnsiballZ_copy.py'
Nov 24 14:14:03 compute-0 sudo[67133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:03 compute-0 python3.9[67135]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993642.7536464-122-168200689687571/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:03 compute-0 sudo[67133]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:04 compute-0 sudo[67285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-croptwurgqvzmmdbilfcbxtdzqdgidmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993643.9415183-137-179019050577641/AnsiballZ_stat.py'
Nov 24 14:14:04 compute-0 sudo[67285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:04 compute-0 python3.9[67287]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:04 compute-0 sudo[67285]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:04 compute-0 sudo[67408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prwdlzhvnusuqchkiqifhwsfdxogelnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993643.9415183-137-179019050577641/AnsiballZ_copy.py'
Nov 24 14:14:04 compute-0 sudo[67408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:04 compute-0 python3.9[67410]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993643.9415183-137-179019050577641/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:04 compute-0 sudo[67408]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:05 compute-0 sudo[67560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anjgzcehqsduowtwlrsnygjrbgwfghiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993645.0520246-152-41311768770706/AnsiballZ_systemd.py'
Nov 24 14:14:05 compute-0 sudo[67560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:05 compute-0 python3.9[67562]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:14:05 compute-0 systemd[1]: Reloading.
Nov 24 14:14:05 compute-0 systemd-sysv-generator[67591]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:14:05 compute-0 systemd-rc-local-generator[67584]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:14:06 compute-0 systemd[1]: Reloading.
Nov 24 14:14:06 compute-0 systemd-sysv-generator[67632]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:14:06 compute-0 systemd-rc-local-generator[67629]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:14:06 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Nov 24 14:14:06 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Nov 24 14:14:06 compute-0 sudo[67560]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:06 compute-0 sudo[67788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzjuzxukoglazyovtnwgfbqwustpadvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993646.5476072-160-132112578274232/AnsiballZ_stat.py'
Nov 24 14:14:06 compute-0 sudo[67788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:06 compute-0 python3.9[67790]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:06 compute-0 sudo[67788]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:07 compute-0 sudo[67911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkoenqtwijmkodndtfkuzbppgkbezsly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993646.5476072-160-132112578274232/AnsiballZ_copy.py'
Nov 24 14:14:07 compute-0 sudo[67911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:07 compute-0 python3.9[67913]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993646.5476072-160-132112578274232/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:07 compute-0 sudo[67911]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:07 compute-0 sudo[68063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jckpuguroqziqzujrhndckbfuvabbgco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993647.6743667-175-64443517311283/AnsiballZ_stat.py'
Nov 24 14:14:07 compute-0 sudo[68063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:08 compute-0 python3.9[68065]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:08 compute-0 sudo[68063]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:08 compute-0 sudo[68186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkvygosyupdduplylxvjlockhxmpkwtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993647.6743667-175-64443517311283/AnsiballZ_copy.py'
Nov 24 14:14:08 compute-0 sudo[68186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:08 compute-0 python3.9[68188]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993647.6743667-175-64443517311283/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:08 compute-0 sudo[68186]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:09 compute-0 sudo[68338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuehhlchfebjdsigpsytsvurngdlkecd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993648.7553701-190-76493232697319/AnsiballZ_systemd.py'
Nov 24 14:14:09 compute-0 sudo[68338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:09 compute-0 python3.9[68340]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:14:09 compute-0 systemd[1]: Reloading.
Nov 24 14:14:09 compute-0 systemd-rc-local-generator[68366]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:14:09 compute-0 systemd-sysv-generator[68370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:14:09 compute-0 systemd[1]: Reloading.
Nov 24 14:14:09 compute-0 systemd-rc-local-generator[68404]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:14:09 compute-0 systemd-sysv-generator[68408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:14:09 compute-0 systemd[1]: Starting Create netns directory...
Nov 24 14:14:09 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 14:14:09 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 14:14:09 compute-0 systemd[1]: Finished Create netns directory.
Nov 24 14:14:09 compute-0 sudo[68338]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:11 compute-0 python3.9[68565]: ansible-ansible.builtin.service_facts Invoked
Nov 24 14:14:11 compute-0 network[68582]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 14:14:11 compute-0 network[68583]: 'network-scripts' will be removed from distribution in near future.
Nov 24 14:14:11 compute-0 network[68584]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 14:14:13 compute-0 sudo[68844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dooendodkmckpmsidswrfcvvxdhejnqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993653.691524-206-90510139526831/AnsiballZ_systemd.py'
Nov 24 14:14:13 compute-0 sudo[68844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:14 compute-0 python3.9[68846]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:14:14 compute-0 systemd[1]: Reloading.
Nov 24 14:14:14 compute-0 systemd-rc-local-generator[68876]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:14:14 compute-0 systemd-sysv-generator[68880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:14:14 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 24 14:14:14 compute-0 iptables.init[68887]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 24 14:14:14 compute-0 iptables.init[68887]: iptables: Flushing firewall rules: [  OK  ]
Nov 24 14:14:14 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Nov 24 14:14:14 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 24 14:14:14 compute-0 sudo[68844]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:15 compute-0 sudo[69081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zeukrururjmeezldivmbwylduaskfzio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993655.0749264-206-88143884042581/AnsiballZ_systemd.py'
Nov 24 14:14:15 compute-0 sudo[69081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:15 compute-0 python3.9[69083]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:14:15 compute-0 sudo[69081]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:16 compute-0 sudo[69235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrphtmhmiwrjyatnrgmbaeikiqwhgwtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993655.986994-222-113131315277590/AnsiballZ_systemd.py'
Nov 24 14:14:16 compute-0 sudo[69235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:16 compute-0 python3.9[69237]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:14:16 compute-0 systemd[1]: Reloading.
Nov 24 14:14:16 compute-0 systemd-sysv-generator[69268]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:14:16 compute-0 systemd-rc-local-generator[69265]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:14:16 compute-0 systemd[1]: Starting Netfilter Tables...
Nov 24 14:14:16 compute-0 systemd[1]: Finished Netfilter Tables.
Nov 24 14:14:17 compute-0 sudo[69235]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:17 compute-0 sudo[69428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-netwkxnumfngayldekcjhccnqxvfmjxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993657.2117171-230-264577978923478/AnsiballZ_command.py'
Nov 24 14:14:17 compute-0 sudo[69428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:17 compute-0 python3.9[69430]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:14:17 compute-0 sudo[69428]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:18 compute-0 sudo[69581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeamjkfgzrircjuhrlihkxuduoanhigz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993658.2458594-244-117528730437044/AnsiballZ_stat.py'
Nov 24 14:14:18 compute-0 sudo[69581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:18 compute-0 python3.9[69583]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:18 compute-0 sudo[69581]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:19 compute-0 sudo[69706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cefivonwxtvvkrtgoxifnhrogzrjddvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993658.2458594-244-117528730437044/AnsiballZ_copy.py'
Nov 24 14:14:19 compute-0 sudo[69706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:19 compute-0 python3.9[69708]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993658.2458594-244-117528730437044/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:19 compute-0 sudo[69706]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:19 compute-0 sudo[69859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkieeeptyqgrmqxpyscsrcqugdptatru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993659.5284848-259-263456193966342/AnsiballZ_systemd.py'
Nov 24 14:14:19 compute-0 sudo[69859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:20 compute-0 python3.9[69861]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:14:20 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Nov 24 14:14:20 compute-0 sshd[1006]: Received SIGHUP; restarting.
Nov 24 14:14:20 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Nov 24 14:14:20 compute-0 sshd[1006]: Server listening on 0.0.0.0 port 22.
Nov 24 14:14:20 compute-0 sshd[1006]: Server listening on :: port 22.
Nov 24 14:14:20 compute-0 sudo[69859]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:20 compute-0 sudo[70015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idbnxkqjeyerawbqgxrkblpkmcybzfgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993660.455335-267-65747723990176/AnsiballZ_file.py'
Nov 24 14:14:20 compute-0 sudo[70015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:20 compute-0 python3.9[70017]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:21 compute-0 sudo[70015]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:21 compute-0 sudo[70167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjxfmxlhcpidlfzrekuvofevxrpfoxmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993661.2834265-275-42574296204428/AnsiballZ_stat.py'
Nov 24 14:14:21 compute-0 sudo[70167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:21 compute-0 python3.9[70169]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:21 compute-0 sudo[70167]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:22 compute-0 sudo[70290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgasqrgjvnxbedacpzkzmcxpugpgrvyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993661.2834265-275-42574296204428/AnsiballZ_copy.py'
Nov 24 14:14:22 compute-0 sudo[70290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:22 compute-0 python3.9[70292]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993661.2834265-275-42574296204428/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:22 compute-0 sudo[70290]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:23 compute-0 sudo[70442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hthncryokvlxnhhqanzfjssyrqfhgfat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993662.6647246-293-175021849989861/AnsiballZ_timezone.py'
Nov 24 14:14:23 compute-0 sudo[70442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:23 compute-0 python3.9[70444]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 24 14:14:23 compute-0 systemd[1]: Starting Time & Date Service...
Nov 24 14:14:23 compute-0 systemd[1]: Started Time & Date Service.
Nov 24 14:14:23 compute-0 sudo[70442]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:23 compute-0 sudo[70598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aczofvwvpnuqygjddbwurbvbgtzqyyjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993663.6445453-302-219475760568315/AnsiballZ_file.py'
Nov 24 14:14:23 compute-0 sudo[70598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:24 compute-0 python3.9[70600]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:24 compute-0 sudo[70598]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:24 compute-0 sudo[70750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyexvmqvmtgexivucgrjnhhjfazjkswc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993664.300496-310-216725756294929/AnsiballZ_stat.py'
Nov 24 14:14:24 compute-0 sudo[70750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:24 compute-0 python3.9[70752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:24 compute-0 sudo[70750]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:25 compute-0 sudo[70873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvxzngccioqkvyzdkedfvhpzkbmqwrpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993664.300496-310-216725756294929/AnsiballZ_copy.py'
Nov 24 14:14:25 compute-0 sudo[70873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:25 compute-0 python3.9[70875]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993664.300496-310-216725756294929/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:25 compute-0 sudo[70873]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:25 compute-0 sudo[71025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcibfepetruzfgrmpdynvzvoyndwcifv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993665.5160332-325-194302015362949/AnsiballZ_stat.py'
Nov 24 14:14:25 compute-0 sudo[71025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:25 compute-0 python3.9[71027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:25 compute-0 sudo[71025]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:26 compute-0 sudo[71148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bopsoejmosxxfwvengtfnwhoruopgega ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993665.5160332-325-194302015362949/AnsiballZ_copy.py'
Nov 24 14:14:26 compute-0 sudo[71148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:26 compute-0 python3.9[71150]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993665.5160332-325-194302015362949/.source.yaml _original_basename=.w5sch8gt follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:26 compute-0 sudo[71148]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:26 compute-0 sudo[71300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egotmvnbvmjrlrvfzkudnweiidjhasxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993666.695519-340-135579857193856/AnsiballZ_stat.py'
Nov 24 14:14:26 compute-0 sudo[71300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:27 compute-0 python3.9[71302]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:27 compute-0 sudo[71300]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:27 compute-0 sudo[71423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szuxejfknjuxqklcshgxiqrxlrexoksy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993666.695519-340-135579857193856/AnsiballZ_copy.py'
Nov 24 14:14:27 compute-0 sudo[71423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:27 compute-0 python3.9[71425]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993666.695519-340-135579857193856/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:27 compute-0 sudo[71423]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:28 compute-0 sudo[71575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcdcbpreoosfytucuztnymtfycydilpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993667.801205-355-263320713484153/AnsiballZ_command.py'
Nov 24 14:14:28 compute-0 sudo[71575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:28 compute-0 python3.9[71577]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:14:28 compute-0 sudo[71575]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:28 compute-0 sudo[71728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpujnpzemwyemjtynlwyhvoxywwgxxik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993668.4085524-363-18237830882338/AnsiballZ_command.py'
Nov 24 14:14:28 compute-0 sudo[71728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:28 compute-0 python3.9[71730]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:14:28 compute-0 sudo[71728]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:29 compute-0 sudo[71881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mleamqsfwraukhouiuzwqlshzphbbvma ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763993669.1321812-371-235013877463340/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 14:14:29 compute-0 sudo[71881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:29 compute-0 python3[71883]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 14:14:29 compute-0 sudo[71881]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:30 compute-0 sudo[72033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptaclpbimskpcjhebeviletzrbebhkmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993669.9465404-379-230800651924789/AnsiballZ_stat.py'
Nov 24 14:14:30 compute-0 sudo[72033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:30 compute-0 python3.9[72035]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:30 compute-0 sudo[72033]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:30 compute-0 sudo[72156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjemwdzzalyqjxajbmlusdnffdwwwumu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993669.9465404-379-230800651924789/AnsiballZ_copy.py'
Nov 24 14:14:30 compute-0 sudo[72156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:30 compute-0 python3.9[72158]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993669.9465404-379-230800651924789/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:30 compute-0 sudo[72156]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:31 compute-0 sudo[72308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsohaadoatxsnapoojxvfhkxxzioktdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993671.1295302-394-44516774357324/AnsiballZ_stat.py'
Nov 24 14:14:31 compute-0 sudo[72308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:31 compute-0 python3.9[72310]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:31 compute-0 sudo[72308]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:31 compute-0 sudo[72431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqryooqugwohxuaupundpwlwkwhkdrzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993671.1295302-394-44516774357324/AnsiballZ_copy.py'
Nov 24 14:14:31 compute-0 sudo[72431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:32 compute-0 python3.9[72433]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993671.1295302-394-44516774357324/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:32 compute-0 sudo[72431]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:32 compute-0 sudo[72583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqqsboorbtqflguwkuxjqvomsgbvskkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993672.1949039-409-92074318455114/AnsiballZ_stat.py'
Nov 24 14:14:32 compute-0 sudo[72583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:32 compute-0 python3.9[72585]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:32 compute-0 sudo[72583]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:33 compute-0 sudo[72706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aagylfxcztlmxujejppykznikwjalhdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993672.1949039-409-92074318455114/AnsiballZ_copy.py'
Nov 24 14:14:33 compute-0 sudo[72706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:33 compute-0 python3.9[72708]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993672.1949039-409-92074318455114/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:33 compute-0 sudo[72706]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:33 compute-0 sudo[72858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqsreutpqvepdiswvisgosyjfbeadndq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993673.4556236-424-23199401443077/AnsiballZ_stat.py'
Nov 24 14:14:33 compute-0 sudo[72858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:33 compute-0 python3.9[72860]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:33 compute-0 sudo[72858]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:34 compute-0 sudo[72981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sddgekbpxyysxryxuhevtpwtsisqgprm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993673.4556236-424-23199401443077/AnsiballZ_copy.py'
Nov 24 14:14:34 compute-0 sudo[72981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:34 compute-0 python3.9[72983]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993673.4556236-424-23199401443077/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:34 compute-0 sudo[72981]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:35 compute-0 sudo[73133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bepxqgratoasxencatnsbiokvmpwahlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993674.7212498-439-218160881558445/AnsiballZ_stat.py'
Nov 24 14:14:35 compute-0 sudo[73133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:35 compute-0 python3.9[73135]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:14:35 compute-0 sudo[73133]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:35 compute-0 sudo[73256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asrpaxnaabajgoewulrtqiqseflminri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993674.7212498-439-218160881558445/AnsiballZ_copy.py'
Nov 24 14:14:35 compute-0 sudo[73256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:35 compute-0 python3.9[73258]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993674.7212498-439-218160881558445/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:35 compute-0 sudo[73256]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:36 compute-0 sudo[73408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oovvmejxofkatdebgweeonmakrrimcbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993676.1234365-454-218244246195249/AnsiballZ_file.py'
Nov 24 14:14:36 compute-0 sudo[73408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:36 compute-0 python3.9[73410]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:36 compute-0 sudo[73408]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:37 compute-0 sudo[73560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztobvhrtlmjjojnlsqqhzqerdeaqcppw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993676.798565-462-131069907105583/AnsiballZ_command.py'
Nov 24 14:14:37 compute-0 sudo[73560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:37 compute-0 python3.9[73562]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:14:37 compute-0 sudo[73560]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:37 compute-0 sudo[73719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdbrfdecfugaejoxjphttcxbbfydhxrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993677.5288901-470-135867602994264/AnsiballZ_blockinfile.py'
Nov 24 14:14:37 compute-0 sudo[73719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:38 compute-0 python3.9[73721]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:38 compute-0 sudo[73719]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:38 compute-0 sudo[73872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgchdpgjtmrfjvmfejdariiedweagufv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993678.4036386-479-239384329807035/AnsiballZ_file.py'
Nov 24 14:14:38 compute-0 sudo[73872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:38 compute-0 python3.9[73874]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:38 compute-0 sudo[73872]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:39 compute-0 sudo[74024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptpiimjdktmjtnwgbrovvxxtszjzranf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993679.0310283-479-85881117242010/AnsiballZ_file.py'
Nov 24 14:14:39 compute-0 sudo[74024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:39 compute-0 python3.9[74026]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:39 compute-0 sudo[74024]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:40 compute-0 sudo[74176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgbotpvfklbvyxzxfenktjhhludxnmes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993679.6750634-494-17695800115304/AnsiballZ_mount.py'
Nov 24 14:14:40 compute-0 sudo[74176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:40 compute-0 python3.9[74178]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 24 14:14:40 compute-0 sudo[74176]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:40 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 14:14:40 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 14:14:40 compute-0 sudo[74330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrhecsdrrbtngxtqmpvuuhwqpovgdbwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993680.5651128-494-72422134442397/AnsiballZ_mount.py'
Nov 24 14:14:40 compute-0 sudo[74330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:41 compute-0 python3.9[74332]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 24 14:14:41 compute-0 sudo[74330]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:41 compute-0 sshd-session[65170]: Connection closed by 192.168.122.30 port 48166
Nov 24 14:14:41 compute-0 sshd-session[65167]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:14:41 compute-0 systemd-logind[807]: Session 15 logged out. Waiting for processes to exit.
Nov 24 14:14:41 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Nov 24 14:14:41 compute-0 systemd[1]: session-15.scope: Consumed 34.063s CPU time.
Nov 24 14:14:41 compute-0 systemd-logind[807]: Removed session 15.
Nov 24 14:14:46 compute-0 sshd-session[74358]: Accepted publickey for zuul from 192.168.122.30 port 36304 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:14:46 compute-0 systemd-logind[807]: New session 16 of user zuul.
Nov 24 14:14:46 compute-0 systemd[1]: Started Session 16 of User zuul.
Nov 24 14:14:46 compute-0 sshd-session[74358]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:14:46 compute-0 sudo[74511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tedkpqtzngqujjokwvgexccfehlleebh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993686.4907148-16-75838122404491/AnsiballZ_tempfile.py'
Nov 24 14:14:46 compute-0 sudo[74511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:47 compute-0 python3.9[74513]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 24 14:14:47 compute-0 sudo[74511]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:47 compute-0 sudo[74663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azvtyrrpakuyiblxrhuwjjbkexdctiqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993687.2496607-28-61856442757051/AnsiballZ_stat.py'
Nov 24 14:14:47 compute-0 sudo[74663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:47 compute-0 python3.9[74665]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:14:47 compute-0 sudo[74663]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:48 compute-0 sudo[74815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgmwagiaudmbbebjyjferhjumhrrizfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993688.123855-38-55206971560172/AnsiballZ_setup.py'
Nov 24 14:14:48 compute-0 sudo[74815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:48 compute-0 python3.9[74817]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:14:49 compute-0 sudo[74815]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:49 compute-0 sudo[74967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pphjiszogszciccdfwyrlpvbylodcvbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993689.2182722-47-11674321971096/AnsiballZ_blockinfile.py'
Nov 24 14:14:49 compute-0 sudo[74967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:49 compute-0 python3.9[74969]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC6Og8JFaZB4pUPq+qPd4w09HZPBk4SA7XESj1DTUPWWJUdFGx3U9cYdKvKDy5yQKj82kIS9n+QI1ATT0Tbsmr1IaACdhCDqyEK6uoEb2TEzKqoZ08sKscdldGWVLJ24zSXIu/WKl2dkwvQFrzbUEKzkKyvB+pGHpKBqAfSMlgB/cbS1fqiAxiRXA6cRUYnJdTWM5IokIOvAqVB8SNskMrC9rj4RGWZm0ObtmTXiTbZew49x0YrNTjJY1tnd8PabLIm7IRpIZSAgueBDWUtumQRpVCz+SDcML8txhTObhdbg9/NlpST0jK3ftVICmzq6rsAXRtjotNsibj3HFOtWuClLxQxvjvy/f6blpY6psl0uShngtLl3DbGNBxdPMHnuUwpldUTwnSL0/iQy6/dzqAbwU4g+VivCC7nzaSIkJrkf4jjrt/o8gCp91Fus0yhNB8oM36pM/lj3NtCheEmGPHv/tlfKJyd/DkMNRkDpXVWLXYvuOFsAgfisscR4Vc1j9E=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJUwP6j+wePYlqRJbyAlqesSWXpJlSL6nmZAyrodABa5
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOCqM/sdmjxkCyxBqoBh678m/YUmxlg7+nqH3T2kFcIS+XlZpXvovQCoCbqUcEEWHO/H5Ke4hWuiSsZ6QlcCUkU=
                                             create=True mode=0644 path=/tmp/ansible.svm7xicc state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:49 compute-0 sudo[74967]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:50 compute-0 sudo[75119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwlmqnvoljxkvwnxkwrsssgbpxvsezww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993689.9862607-55-13936186952851/AnsiballZ_command.py'
Nov 24 14:14:50 compute-0 sudo[75119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:50 compute-0 python3.9[75121]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.svm7xicc' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:14:50 compute-0 sudo[75119]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:51 compute-0 sudo[75273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbovusiirzrqtqrqcvymsjgfjlxvpkbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993690.8284516-63-133370239267366/AnsiballZ_file.py'
Nov 24 14:14:51 compute-0 sudo[75273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:51 compute-0 python3.9[75275]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.svm7xicc state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:14:51 compute-0 sudo[75273]: pam_unix(sudo:session): session closed for user root
Nov 24 14:14:51 compute-0 sshd-session[74361]: Connection closed by 192.168.122.30 port 36304
Nov 24 14:14:51 compute-0 sshd-session[74358]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:14:51 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Nov 24 14:14:51 compute-0 systemd[1]: session-16.scope: Consumed 3.425s CPU time.
Nov 24 14:14:51 compute-0 systemd-logind[807]: Session 16 logged out. Waiting for processes to exit.
Nov 24 14:14:51 compute-0 systemd-logind[807]: Removed session 16.
Nov 24 14:14:53 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 24 14:14:57 compute-0 sshd-session[75302]: Accepted publickey for zuul from 192.168.122.30 port 36960 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:14:57 compute-0 systemd-logind[807]: New session 17 of user zuul.
Nov 24 14:14:57 compute-0 systemd[1]: Started Session 17 of User zuul.
Nov 24 14:14:57 compute-0 sshd-session[75302]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:14:58 compute-0 python3.9[75455]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:14:59 compute-0 sudo[75609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adohycmbslyhmkrjkjkfvcnzwvvgtfyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993698.875968-32-125447989100082/AnsiballZ_systemd.py'
Nov 24 14:14:59 compute-0 sudo[75609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:14:59 compute-0 python3.9[75611]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 24 14:14:59 compute-0 sudo[75609]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:00 compute-0 sudo[75763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsmmqrggtulbwdljfemdzvfxnwxrggzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993700.0520482-40-227581312916833/AnsiballZ_systemd.py'
Nov 24 14:15:00 compute-0 sudo[75763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:00 compute-0 python3.9[75765]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:15:00 compute-0 sudo[75763]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:01 compute-0 sudo[75916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhsqvinnjrbjrmvtzbsfqbqvdqwxrfsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993700.9860897-49-256728156548062/AnsiballZ_command.py'
Nov 24 14:15:01 compute-0 sudo[75916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:01 compute-0 python3.9[75918]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:15:01 compute-0 sudo[75916]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:02 compute-0 sudo[76069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leelslvyyykljffnsudtvixkigsfqqwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993701.7950082-57-59958541465269/AnsiballZ_stat.py'
Nov 24 14:15:02 compute-0 sudo[76069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:02 compute-0 python3.9[76071]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:15:02 compute-0 sudo[76069]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:02 compute-0 sudo[76223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsupyefklddthrcnhxgevterheeauoom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993702.5845683-65-176321343769648/AnsiballZ_command.py'
Nov 24 14:15:02 compute-0 sudo[76223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:03 compute-0 python3.9[76225]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:15:03 compute-0 sudo[76223]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:03 compute-0 sudo[76378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbxsgzeuvgiupxmvisvhxwetwzvuauge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993703.308237-73-49854078570873/AnsiballZ_file.py'
Nov 24 14:15:03 compute-0 sudo[76378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:03 compute-0 python3.9[76380]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:04 compute-0 sudo[76378]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:04 compute-0 sshd-session[75305]: Connection closed by 192.168.122.30 port 36960
Nov 24 14:15:04 compute-0 sshd-session[75302]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:15:04 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Nov 24 14:15:04 compute-0 systemd[1]: session-17.scope: Consumed 4.470s CPU time.
Nov 24 14:15:04 compute-0 systemd-logind[807]: Session 17 logged out. Waiting for processes to exit.
Nov 24 14:15:04 compute-0 systemd-logind[807]: Removed session 17.
Nov 24 14:15:09 compute-0 sshd-session[76405]: Accepted publickey for zuul from 192.168.122.30 port 33200 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:15:09 compute-0 systemd-logind[807]: New session 18 of user zuul.
Nov 24 14:15:10 compute-0 systemd[1]: Started Session 18 of User zuul.
Nov 24 14:15:10 compute-0 sshd-session[76405]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:15:11 compute-0 python3.9[76558]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:15:11 compute-0 sudo[76712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzmlqsodczhnopnydzarsgeerijfzblu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993711.5639977-34-13584678502231/AnsiballZ_setup.py'
Nov 24 14:15:11 compute-0 sudo[76712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:12 compute-0 python3.9[76714]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:15:12 compute-0 sudo[76712]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:12 compute-0 sudo[76796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmfrnowwsjyduveyxshkiygnwdajbnly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993711.5639977-34-13584678502231/AnsiballZ_dnf.py'
Nov 24 14:15:12 compute-0 sudo[76796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:13 compute-0 python3.9[76798]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 24 14:15:14 compute-0 sudo[76796]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:15 compute-0 python3.9[76949]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:15:16 compute-0 python3.9[77100]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 14:15:17 compute-0 python3.9[77250]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:15:18 compute-0 python3.9[77400]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:15:18 compute-0 sshd-session[76408]: Connection closed by 192.168.122.30 port 33200
Nov 24 14:15:18 compute-0 sshd-session[76405]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:15:18 compute-0 systemd-logind[807]: Session 18 logged out. Waiting for processes to exit.
Nov 24 14:15:18 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Nov 24 14:15:18 compute-0 systemd[1]: session-18.scope: Consumed 6.545s CPU time.
Nov 24 14:15:18 compute-0 systemd-logind[807]: Removed session 18.
Nov 24 14:15:24 compute-0 sshd-session[77425]: Accepted publickey for zuul from 192.168.122.30 port 36798 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:15:24 compute-0 systemd-logind[807]: New session 19 of user zuul.
Nov 24 14:15:24 compute-0 systemd[1]: Started Session 19 of User zuul.
Nov 24 14:15:24 compute-0 sshd-session[77425]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:15:25 compute-0 python3.9[77578]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:15:26 compute-0 sudo[77732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwjwwcgtwabsapkikeqpdguzqeqhjdku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993726.0799286-50-118705296977759/AnsiballZ_file.py'
Nov 24 14:15:26 compute-0 sudo[77732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:26 compute-0 python3.9[77734]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:26 compute-0 sudo[77732]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:27 compute-0 sudo[77884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmxvyfjzikxrcqzuwxreknocnhfxwycr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993726.8614333-50-133394384182113/AnsiballZ_file.py'
Nov 24 14:15:27 compute-0 sudo[77884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:27 compute-0 python3.9[77886]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:27 compute-0 sudo[77884]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:28 compute-0 sudo[78036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brfhntjxwoszaquqisdmluirkgxafhiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993727.7491007-65-84060695772667/AnsiballZ_stat.py'
Nov 24 14:15:28 compute-0 sudo[78036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:28 compute-0 python3.9[78038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:28 compute-0 sudo[78036]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:28 compute-0 sudo[78159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylywykjbrtbxnttukxtzgawsfrpitqsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993727.7491007-65-84060695772667/AnsiballZ_copy.py'
Nov 24 14:15:28 compute-0 sudo[78159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:29 compute-0 python3.9[78161]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993727.7491007-65-84060695772667/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=b105b4caf39400078f52f6bd88257f8513a9ad81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:29 compute-0 sudo[78159]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:29 compute-0 sudo[78311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-einvielhugqjfhjcdbyoybpqxmvgrsuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993729.39584-65-265699503823293/AnsiballZ_stat.py'
Nov 24 14:15:29 compute-0 sudo[78311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:29 compute-0 python3.9[78313]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:29 compute-0 sudo[78311]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:30 compute-0 sudo[78434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eltdubqonalahetjixmfkagaafutrzbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993729.39584-65-265699503823293/AnsiballZ_copy.py'
Nov 24 14:15:30 compute-0 sudo[78434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:30 compute-0 python3.9[78436]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993729.39584-65-265699503823293/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=db04c5f67f5f09b17cd3ca63095531e54a2f098c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:30 compute-0 sudo[78434]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:30 compute-0 sudo[78586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wheepbyjvjbpjwymmqzthiziryicrpoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993730.6377347-65-45730861656539/AnsiballZ_stat.py'
Nov 24 14:15:30 compute-0 sudo[78586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:31 compute-0 python3.9[78588]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:31 compute-0 sudo[78586]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:31 compute-0 sudo[78709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvgszsubglkiiylwchrnyecvpiwhlgyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993730.6377347-65-45730861656539/AnsiballZ_copy.py'
Nov 24 14:15:31 compute-0 sudo[78709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:31 compute-0 python3.9[78711]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993730.6377347-65-45730861656539/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=b82dda8952277066674af7f7e1fc459ef12a5411 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:31 compute-0 sudo[78709]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:32 compute-0 sudo[78861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfrrulkxariflhojblioqfzaudofszcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993731.8568015-109-16580892686157/AnsiballZ_file.py'
Nov 24 14:15:32 compute-0 sudo[78861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:32 compute-0 python3.9[78863]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:32 compute-0 sudo[78861]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:32 compute-0 sudo[79013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxpagbrvqutwoetrgmyfsybquspzpyss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993732.538166-109-78705613251747/AnsiballZ_file.py'
Nov 24 14:15:32 compute-0 sudo[79013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:33 compute-0 python3.9[79015]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:33 compute-0 sudo[79013]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:33 compute-0 sudo[79165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjxacgxwnbrsuxcuelpsdtkqvylgdkur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993733.2115421-124-148875966075325/AnsiballZ_stat.py'
Nov 24 14:15:33 compute-0 sudo[79165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:33 compute-0 python3.9[79167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:33 compute-0 sudo[79165]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:34 compute-0 sudo[79288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssjbzfnccwndqhajdtxwfzpbdvhvxxzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993733.2115421-124-148875966075325/AnsiballZ_copy.py'
Nov 24 14:15:34 compute-0 sudo[79288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:34 compute-0 python3.9[79290]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993733.2115421-124-148875966075325/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=8798584e1771ef9c5735223ea6ad370e46569ef1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:34 compute-0 sudo[79288]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:34 compute-0 sudo[79440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhsspyrmaycriyoywnufnxynhoeaofgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993734.339921-124-90320186225934/AnsiballZ_stat.py'
Nov 24 14:15:34 compute-0 sudo[79440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:34 compute-0 python3.9[79442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:34 compute-0 sudo[79440]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:35 compute-0 sudo[79563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abvubdafxxlwtfckpyrmmjfaoimhwwpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993734.339921-124-90320186225934/AnsiballZ_copy.py'
Nov 24 14:15:35 compute-0 sudo[79563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:35 compute-0 python3.9[79565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993734.339921-124-90320186225934/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=5b0cfa4900af918cb3a478fa49f2db302d6086f9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:35 compute-0 sudo[79563]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:35 compute-0 sudo[79715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfnhfikgvungatqzbjaczcjjxwoxpgpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993735.4982162-124-72882137146237/AnsiballZ_stat.py'
Nov 24 14:15:35 compute-0 sudo[79715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:35 compute-0 python3.9[79717]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:35 compute-0 sudo[79715]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:36 compute-0 sudo[79838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azyryokjvtciucikohdvcnhiudevkadj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993735.4982162-124-72882137146237/AnsiballZ_copy.py'
Nov 24 14:15:36 compute-0 sudo[79838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:36 compute-0 python3.9[79840]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993735.4982162-124-72882137146237/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=be1c19d441b0bd482ba6697d5037dc244f649e67 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:36 compute-0 sudo[79838]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:36 compute-0 sudo[79990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fuywetzzgatxuudupturxxcdrztkldho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993736.7002132-168-237431091316001/AnsiballZ_file.py'
Nov 24 14:15:36 compute-0 sudo[79990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:37 compute-0 python3.9[79992]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:37 compute-0 sudo[79990]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:37 compute-0 sudo[80142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbkftenvspeulqyijyvlkcwriefjjoxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993737.3399773-168-226855510106579/AnsiballZ_file.py'
Nov 24 14:15:37 compute-0 sudo[80142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:37 compute-0 python3.9[80144]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:37 compute-0 sudo[80142]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:38 compute-0 sudo[80294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wroopigyqpyuizgwjpozhxrizhjxwmoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993738.0310385-183-28261826334884/AnsiballZ_stat.py'
Nov 24 14:15:38 compute-0 sudo[80294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:38 compute-0 python3.9[80296]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:38 compute-0 sudo[80294]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:38 compute-0 sudo[80417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npabzidoazvwnieghxwvqjiuaftjrfyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993738.0310385-183-28261826334884/AnsiballZ_copy.py'
Nov 24 14:15:38 compute-0 sudo[80417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:38 compute-0 python3.9[80419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993738.0310385-183-28261826334884/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=f52a297e7984fb78407864988e9a19ef0c2f10c7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:38 compute-0 sudo[80417]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:39 compute-0 sudo[80569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqycjusutgjvlzofluwfvwamuujmdmlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993739.1620145-183-152373669342521/AnsiballZ_stat.py'
Nov 24 14:15:39 compute-0 sudo[80569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:39 compute-0 python3.9[80571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:39 compute-0 sudo[80569]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:40 compute-0 sudo[80692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcefruatfnzelvyjahpkbfbfzcmtwjge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993739.1620145-183-152373669342521/AnsiballZ_copy.py'
Nov 24 14:15:40 compute-0 sudo[80692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:40 compute-0 python3.9[80694]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993739.1620145-183-152373669342521/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c984463ccbd50fcc79b5d02c8477db4e6485b7e1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:40 compute-0 sudo[80692]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:40 compute-0 sudo[80844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lduztkxxtmmkaapkrpznyokpifuofnnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993740.38323-183-57116317409588/AnsiballZ_stat.py'
Nov 24 14:15:40 compute-0 sudo[80844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:40 compute-0 python3.9[80846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:40 compute-0 sudo[80844]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:41 compute-0 sudo[80967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-celnnzfuzrxzbaggpfvioivyhmfwvohz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993740.38323-183-57116317409588/AnsiballZ_copy.py'
Nov 24 14:15:41 compute-0 sudo[80967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:41 compute-0 python3.9[80969]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993740.38323-183-57116317409588/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=c607ff2174b645b1916338fe9a4c52380d4e106d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:41 compute-0 sudo[80967]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:41 compute-0 sudo[81119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmaminojuessalpauhpjytbltbzkacrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993741.689793-227-89614725632534/AnsiballZ_file.py'
Nov 24 14:15:41 compute-0 sudo[81119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:42 compute-0 python3.9[81121]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:42 compute-0 sudo[81119]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:42 compute-0 sudo[81271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txravjzyxepxjshpfojbpdsvgplhpwxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993742.282946-227-199048124721572/AnsiballZ_file.py'
Nov 24 14:15:42 compute-0 sudo[81271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:42 compute-0 python3.9[81273]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:42 compute-0 sudo[81271]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:43 compute-0 sudo[81423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilgrfukmxzkwdlqvgbfjwhdjcssquczb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993742.9704335-242-259944724247119/AnsiballZ_stat.py'
Nov 24 14:15:43 compute-0 sudo[81423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:43 compute-0 python3.9[81425]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:43 compute-0 sudo[81423]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:43 compute-0 sudo[81546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiljbjmnnzddrrlvxpxmlkpiberquqil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993742.9704335-242-259944724247119/AnsiballZ_copy.py'
Nov 24 14:15:43 compute-0 sudo[81546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:44 compute-0 python3.9[81548]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993742.9704335-242-259944724247119/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=c4d8927f9aadede5e02f5260a919ffbad05862ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:44 compute-0 sudo[81546]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:44 compute-0 sudo[81698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjzbsjfpsepiaivirqbweiscrolhbwqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993744.2069206-242-64955685855316/AnsiballZ_stat.py'
Nov 24 14:15:44 compute-0 sudo[81698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:44 compute-0 python3.9[81700]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:44 compute-0 sudo[81698]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:45 compute-0 sudo[81821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwwwlirbwgfratnrliurnrktkszccoke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993744.2069206-242-64955685855316/AnsiballZ_copy.py'
Nov 24 14:15:45 compute-0 sudo[81821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:45 compute-0 python3.9[81823]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993744.2069206-242-64955685855316/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c984463ccbd50fcc79b5d02c8477db4e6485b7e1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:45 compute-0 sudo[81821]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:45 compute-0 sudo[81973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzoqrmzinwczpfzraqouzdoaondgpmhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993745.405178-242-252009275266536/AnsiballZ_stat.py'
Nov 24 14:15:45 compute-0 sudo[81973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:45 compute-0 python3.9[81975]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:45 compute-0 sudo[81973]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:46 compute-0 sudo[82096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmvtucxreqopdvmstplsucgnjehhlgrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993745.405178-242-252009275266536/AnsiballZ_copy.py'
Nov 24 14:15:46 compute-0 sudo[82096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:46 compute-0 python3.9[82098]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993745.405178-242-252009275266536/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=5e6e216a65a2a677e1a7f6ea8bd2ef6506ef6012 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:46 compute-0 sudo[82096]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:47 compute-0 sudo[82248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubmocmyucjoiuskimdugejieztovrhva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993747.187108-302-146574730494290/AnsiballZ_file.py'
Nov 24 14:15:47 compute-0 sudo[82248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:47 compute-0 python3.9[82250]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:47 compute-0 sudo[82248]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:48 compute-0 sudo[82400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpicobiypdnnyewedfovqkjqiwxnpenl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993747.8510015-310-86766890997867/AnsiballZ_stat.py'
Nov 24 14:15:48 compute-0 sudo[82400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:48 compute-0 python3.9[82402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:48 compute-0 sudo[82400]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:48 compute-0 sudo[82523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhgoebaffaxhgyblcstgvvenmqhhkdsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993747.8510015-310-86766890997867/AnsiballZ_copy.py'
Nov 24 14:15:48 compute-0 sudo[82523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:48 compute-0 python3.9[82525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993747.8510015-310-86766890997867/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=451f751a491b9363156c1c8b1997faec65d8ee76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:48 compute-0 sudo[82523]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:49 compute-0 sudo[82675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feaegxsljntozzrscmlwhibjcksgylch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993749.183522-326-13773276804038/AnsiballZ_file.py'
Nov 24 14:15:49 compute-0 sudo[82675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:49 compute-0 python3.9[82677]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:49 compute-0 sudo[82675]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:50 compute-0 sudo[82827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubjewanbkecbbpvvxqrdyfchzphnmoux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993749.9185667-334-61997465596527/AnsiballZ_stat.py'
Nov 24 14:15:50 compute-0 sudo[82827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:50 compute-0 python3.9[82829]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:50 compute-0 sudo[82827]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:50 compute-0 sudo[82950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muwevzktjsauhbyulcyxjmijdkxlioif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993749.9185667-334-61997465596527/AnsiballZ_copy.py'
Nov 24 14:15:50 compute-0 sudo[82950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:50 compute-0 python3.9[82952]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993749.9185667-334-61997465596527/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=451f751a491b9363156c1c8b1997faec65d8ee76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:50 compute-0 sudo[82950]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:51 compute-0 sudo[83102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccngxsiitehpfphynlfbqqalzrdpnvjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993751.1564977-350-163572298521546/AnsiballZ_file.py'
Nov 24 14:15:51 compute-0 sudo[83102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:51 compute-0 python3.9[83104]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:51 compute-0 sudo[83102]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:51 compute-0 sudo[83254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plzemtusbptcktujtsgkfrparbtuogxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993751.7124326-358-61104552483135/AnsiballZ_stat.py'
Nov 24 14:15:51 compute-0 sudo[83254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:52 compute-0 python3.9[83256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:52 compute-0 sudo[83254]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:52 compute-0 sudo[83377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhrjbgimlkdcolqgganqlylxzhqermwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993751.7124326-358-61104552483135/AnsiballZ_copy.py'
Nov 24 14:15:52 compute-0 sudo[83377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:52 compute-0 python3.9[83379]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993751.7124326-358-61104552483135/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=451f751a491b9363156c1c8b1997faec65d8ee76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:52 compute-0 sudo[83377]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:53 compute-0 sudo[83529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olbtcmedpmfoqfilegsnovakoqsivwyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993752.9958804-374-48988198897380/AnsiballZ_file.py'
Nov 24 14:15:53 compute-0 sudo[83529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:53 compute-0 python3.9[83531]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:53 compute-0 sudo[83529]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:53 compute-0 sudo[83681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-picxgnrjfhcirklywonkdugeihypmjta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993753.6423-382-183107477096331/AnsiballZ_stat.py'
Nov 24 14:15:53 compute-0 sudo[83681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:54 compute-0 python3.9[83683]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:54 compute-0 sudo[83681]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:54 compute-0 sudo[83804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jchsjywyqvhjqpfcwwquwrwvibwvubdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993753.6423-382-183107477096331/AnsiballZ_copy.py'
Nov 24 14:15:54 compute-0 sudo[83804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:54 compute-0 python3.9[83806]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993753.6423-382-183107477096331/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=451f751a491b9363156c1c8b1997faec65d8ee76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:54 compute-0 sudo[83804]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:55 compute-0 sudo[83956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtnxelzcaptzsezebopgiggpoobrjefr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993754.8903215-398-157274140830830/AnsiballZ_file.py'
Nov 24 14:15:55 compute-0 sudo[83956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:55 compute-0 python3.9[83958]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:55 compute-0 sudo[83956]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:55 compute-0 sudo[84108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhqwuzsoejeinofesadbnijwthhxersc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993755.4636629-406-21822645499719/AnsiballZ_stat.py'
Nov 24 14:15:55 compute-0 sudo[84108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:55 compute-0 python3.9[84110]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:55 compute-0 sudo[84108]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:56 compute-0 sudo[84231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzhhylqecdhpkaeoztboqcxzrivhgvws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993755.4636629-406-21822645499719/AnsiballZ_copy.py'
Nov 24 14:15:56 compute-0 sudo[84231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:56 compute-0 python3.9[84233]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993755.4636629-406-21822645499719/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=451f751a491b9363156c1c8b1997faec65d8ee76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:56 compute-0 sudo[84231]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:57 compute-0 sudo[84383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kufpnwdewlmhhhuqjtgrdadglwwyacef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993756.7530873-422-65258263197123/AnsiballZ_file.py'
Nov 24 14:15:57 compute-0 sudo[84383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:57 compute-0 python3.9[84385]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:57 compute-0 sudo[84383]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:57 compute-0 chronyd[65141]: Selected source 216.232.132.102 (pool.ntp.org)
Nov 24 14:15:57 compute-0 sudo[84535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwudqnfuzlimvhtxexqqtcwcaklozdjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993757.4225414-430-201312938076679/AnsiballZ_stat.py'
Nov 24 14:15:57 compute-0 sudo[84535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:57 compute-0 python3.9[84537]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:15:57 compute-0 sudo[84535]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:58 compute-0 sudo[84658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbxfghiutnzjuqqbsbbrlvciiozycthy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993757.4225414-430-201312938076679/AnsiballZ_copy.py'
Nov 24 14:15:58 compute-0 sudo[84658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:58 compute-0 python3.9[84660]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993757.4225414-430-201312938076679/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=451f751a491b9363156c1c8b1997faec65d8ee76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:15:58 compute-0 sudo[84658]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:59 compute-0 sudo[84810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyfcidwnfjoiiawghwqkztacilsuwzxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993758.7649477-446-80099596688068/AnsiballZ_file.py'
Nov 24 14:15:59 compute-0 sudo[84810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:15:59 compute-0 python3.9[84812]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:15:59 compute-0 sudo[84810]: pam_unix(sudo:session): session closed for user root
Nov 24 14:15:59 compute-0 sudo[84962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hphtrilsvtrrmrzecwklfefaovquhxrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993759.5203085-454-172693330196897/AnsiballZ_stat.py'
Nov 24 14:15:59 compute-0 sudo[84962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:00 compute-0 python3.9[84964]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:00 compute-0 sudo[84962]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:00 compute-0 sudo[85085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glbxuzxcymksrojcbdiegokaahckglob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993759.5203085-454-172693330196897/AnsiballZ_copy.py'
Nov 24 14:16:00 compute-0 sudo[85085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:00 compute-0 python3.9[85087]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993759.5203085-454-172693330196897/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=451f751a491b9363156c1c8b1997faec65d8ee76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:00 compute-0 sudo[85085]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:00 compute-0 sshd-session[77428]: Connection closed by 192.168.122.30 port 36798
Nov 24 14:16:00 compute-0 sshd-session[77425]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:16:00 compute-0 systemd-logind[807]: Session 19 logged out. Waiting for processes to exit.
Nov 24 14:16:00 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Nov 24 14:16:00 compute-0 systemd[1]: session-19.scope: Consumed 29.048s CPU time.
Nov 24 14:16:00 compute-0 systemd-logind[807]: Removed session 19.
Nov 24 14:16:06 compute-0 sshd-session[85112]: Accepted publickey for zuul from 192.168.122.30 port 45838 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:16:06 compute-0 systemd-logind[807]: New session 20 of user zuul.
Nov 24 14:16:06 compute-0 systemd[1]: Started Session 20 of User zuul.
Nov 24 14:16:06 compute-0 sshd-session[85112]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:16:07 compute-0 python3.9[85265]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:16:08 compute-0 sudo[85419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjnsymtqkzwnrqyyhwppufrojzrypxgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993768.168679-34-147546844406182/AnsiballZ_file.py'
Nov 24 14:16:08 compute-0 sudo[85419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:08 compute-0 python3.9[85421]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:16:08 compute-0 sudo[85419]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:09 compute-0 sudo[85571]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwwfhwhxwjlhxcrzmioxqeuivqdrlktl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993768.9597492-34-159307789826728/AnsiballZ_file.py'
Nov 24 14:16:09 compute-0 sudo[85571]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:09 compute-0 python3.9[85573]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:16:09 compute-0 sudo[85571]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:10 compute-0 python3.9[85723]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:16:10 compute-0 sudo[85873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daptccuvoksafdfhskvsgekdpzdhbilb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993770.3943012-57-192704235568922/AnsiballZ_seboolean.py'
Nov 24 14:16:10 compute-0 sudo[85873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:11 compute-0 python3.9[85875]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 24 14:16:12 compute-0 sudo[85873]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:12 compute-0 sudo[86029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzyficbwsgmwsdbmmtaohjdmcirodouj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993772.4366024-67-114806319694548/AnsiballZ_setup.py'
Nov 24 14:16:12 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 24 14:16:12 compute-0 sudo[86029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:12 compute-0 python3.9[86031]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:16:13 compute-0 sudo[86029]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:13 compute-0 sudo[86113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udwdaukedyuwqclkasdqwuedelcjetqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993772.4366024-67-114806319694548/AnsiballZ_dnf.py'
Nov 24 14:16:13 compute-0 sudo[86113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:13 compute-0 python3.9[86115]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:16:15 compute-0 sudo[86113]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:15 compute-0 sudo[86266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmybzrgseinfukejalezomitztqxtber ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993775.323383-79-236970925514540/AnsiballZ_systemd.py'
Nov 24 14:16:15 compute-0 sudo[86266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:16 compute-0 python3.9[86268]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 14:16:16 compute-0 sudo[86266]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:16 compute-0 sudo[86421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdfkzxitatgmnmtbkesprzhlgebbkmkb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763993776.4743392-87-19848337847137/AnsiballZ_edpm_nftables_snippet.py'
Nov 24 14:16:16 compute-0 sudo[86421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:17 compute-0 python3[86423]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 24 14:16:17 compute-0 sudo[86421]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:17 compute-0 sudo[86573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gebncseviodkuhyyynpmhjsdgdubrkld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993777.317778-96-248153764520711/AnsiballZ_file.py'
Nov 24 14:16:17 compute-0 sudo[86573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:17 compute-0 python3.9[86575]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:17 compute-0 sudo[86573]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:18 compute-0 sudo[86725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moxbkmlbajaikufgwgsptvyrzsfcvryh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993777.985313-104-113569027919482/AnsiballZ_stat.py'
Nov 24 14:16:18 compute-0 sudo[86725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:18 compute-0 python3.9[86727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:18 compute-0 sudo[86725]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:18 compute-0 sudo[86803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reroizgnsryyqzxguqxannqfkrlwwnrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993777.985313-104-113569027919482/AnsiballZ_file.py'
Nov 24 14:16:18 compute-0 sudo[86803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:18 compute-0 python3.9[86805]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:19 compute-0 sudo[86803]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:19 compute-0 sudo[86955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evrztycgywsznfvznaprzhhysetwwaak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993779.133773-116-182887521672424/AnsiballZ_stat.py'
Nov 24 14:16:19 compute-0 sudo[86955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:19 compute-0 python3.9[86957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:19 compute-0 sudo[86955]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:19 compute-0 sudo[87033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dysxqznkguxipmsgjwbepidhbisndbub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993779.133773-116-182887521672424/AnsiballZ_file.py'
Nov 24 14:16:19 compute-0 sudo[87033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:20 compute-0 python3.9[87035]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.pdhsn4ug recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:20 compute-0 sudo[87033]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:20 compute-0 sudo[87185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rguhorszqsdrncyntwkxkpvahiddpxyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993780.1810448-128-207443448564964/AnsiballZ_stat.py'
Nov 24 14:16:20 compute-0 sudo[87185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:20 compute-0 python3.9[87187]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:20 compute-0 sudo[87185]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:20 compute-0 sudo[87263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyloslbmwydsjzutnbkdlqftxwchhivl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993780.1810448-128-207443448564964/AnsiballZ_file.py'
Nov 24 14:16:20 compute-0 sudo[87263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:21 compute-0 python3.9[87265]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:21 compute-0 sudo[87263]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:21 compute-0 sudo[87415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cuuyarhqdqxncduxhukitaayalkduyzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993781.2948627-141-131943510335372/AnsiballZ_command.py'
Nov 24 14:16:21 compute-0 sudo[87415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:21 compute-0 python3.9[87417]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:16:21 compute-0 sudo[87415]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:22 compute-0 sudo[87568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgqufbkusseyuublugwnnykryddmpgrd ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763993782.0761-149-5295465928857/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 14:16:22 compute-0 sudo[87568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:22 compute-0 python3[87570]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 14:16:22 compute-0 sudo[87568]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:23 compute-0 sudo[87720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyfarjvdqipochmbxrahxggknimqfort ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993782.9563015-157-69022065539270/AnsiballZ_stat.py'
Nov 24 14:16:23 compute-0 sudo[87720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:23 compute-0 python3.9[87722]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:23 compute-0 sudo[87720]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:23 compute-0 sudo[87845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irrjselrvfbyiecomgrnzlitrsbwnehj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993782.9563015-157-69022065539270/AnsiballZ_copy.py'
Nov 24 14:16:23 compute-0 sudo[87845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:24 compute-0 python3.9[87847]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993782.9563015-157-69022065539270/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:24 compute-0 sudo[87845]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:24 compute-0 sudo[87997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndznxryarvqkoekjldatbvstosuzftxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993784.3071513-172-63780353631621/AnsiballZ_stat.py'
Nov 24 14:16:24 compute-0 sudo[87997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:24 compute-0 python3.9[87999]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:24 compute-0 sudo[87997]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:25 compute-0 sudo[88122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztngnksmkegsetkrwsvgmgjsrcaqdaoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993784.3071513-172-63780353631621/AnsiballZ_copy.py'
Nov 24 14:16:25 compute-0 sudo[88122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:25 compute-0 python3.9[88124]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993784.3071513-172-63780353631621/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:25 compute-0 sudo[88122]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:25 compute-0 sudo[88274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dskbavtebvhivtsjjagbspudntoszwjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993785.5965605-187-61964561975179/AnsiballZ_stat.py'
Nov 24 14:16:25 compute-0 sudo[88274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:26 compute-0 python3.9[88276]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:26 compute-0 sudo[88274]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:26 compute-0 sudo[88399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ridazntfembjnuzwqehzfbiuztffzyvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993785.5965605-187-61964561975179/AnsiballZ_copy.py'
Nov 24 14:16:26 compute-0 sudo[88399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:26 compute-0 python3.9[88401]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993785.5965605-187-61964561975179/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:26 compute-0 sudo[88399]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:27 compute-0 sudo[88551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grflsrperqrjwvncprhmljqpmrqyyvdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993786.7825632-202-199406946317828/AnsiballZ_stat.py'
Nov 24 14:16:27 compute-0 sudo[88551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:27 compute-0 python3.9[88553]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:27 compute-0 sudo[88551]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:27 compute-0 sudo[88676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybbrnmglwldklxanrttfqnouizalobta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993786.7825632-202-199406946317828/AnsiballZ_copy.py'
Nov 24 14:16:27 compute-0 sudo[88676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:27 compute-0 python3.9[88678]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993786.7825632-202-199406946317828/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:27 compute-0 sudo[88676]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:28 compute-0 sudo[88828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uphllwbgghpodoajiolmlfipvucdbnas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993787.9737482-217-194718243750976/AnsiballZ_stat.py'
Nov 24 14:16:28 compute-0 sudo[88828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:28 compute-0 python3.9[88830]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:28 compute-0 sudo[88828]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:28 compute-0 sudo[88953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abizwlpjdqmewsjrzxyntpfpwbjcqxug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993787.9737482-217-194718243750976/AnsiballZ_copy.py'
Nov 24 14:16:28 compute-0 sudo[88953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:29 compute-0 python3.9[88955]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763993787.9737482-217-194718243750976/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:29 compute-0 sudo[88953]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:29 compute-0 sudo[89105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvgrnobcbprutgciiepvqxealnvnheim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993789.2228668-232-41287439367324/AnsiballZ_file.py'
Nov 24 14:16:29 compute-0 sudo[89105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:29 compute-0 python3.9[89107]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:29 compute-0 sudo[89105]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:30 compute-0 sudo[89257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olojmovrdjewesytckoakpwciaufwwxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993789.9843729-240-70971540754820/AnsiballZ_command.py'
Nov 24 14:16:30 compute-0 sudo[89257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:30 compute-0 python3.9[89259]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:16:30 compute-0 sudo[89257]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:31 compute-0 sudo[89412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xirfyvchhequaheqvlghsvyymrqqhhkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993790.6409245-248-74673035684017/AnsiballZ_blockinfile.py'
Nov 24 14:16:31 compute-0 sudo[89412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:31 compute-0 python3.9[89414]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:31 compute-0 sudo[89412]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:31 compute-0 sudo[89564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyrxjsilwhmqyebjvgdcoxdmzyorxzhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993791.5515459-257-208046187093260/AnsiballZ_command.py'
Nov 24 14:16:31 compute-0 sudo[89564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:32 compute-0 python3.9[89566]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:16:32 compute-0 sudo[89564]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:32 compute-0 sudo[89717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfdbdroxlwvjxbzuzxvipudebbekketn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993792.2204664-265-202404363882807/AnsiballZ_stat.py'
Nov 24 14:16:32 compute-0 sudo[89717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:32 compute-0 python3.9[89719]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:16:32 compute-0 sudo[89717]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:33 compute-0 sudo[89871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfvnbsasbydnmanhejcupyugevblsqti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993792.8620334-273-494269007885/AnsiballZ_command.py'
Nov 24 14:16:33 compute-0 sudo[89871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:33 compute-0 python3.9[89873]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:16:33 compute-0 sudo[89871]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:33 compute-0 sudo[90027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrhbkgekihckiggpyzsxyzfnlveqocug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993793.5511003-281-280560159094592/AnsiballZ_file.py'
Nov 24 14:16:33 compute-0 sudo[90027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:34 compute-0 python3.9[90029]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:34 compute-0 sudo[90027]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:35 compute-0 python3.9[90179]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:16:35 compute-0 sudo[90331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmwhcyqonpisknlglkrzxulusrxmgyyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993795.7447045-321-15621314024718/AnsiballZ_command.py'
Nov 24 14:16:35 compute-0 sudo[90331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:36 compute-0 python3.9[90333]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:93:45:69:49" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:16:36 compute-0 ovs-vsctl[90334]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:93:45:69:49 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 24 14:16:36 compute-0 sudo[90331]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:36 compute-0 sudo[90484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvjegqefntnpfzoakgthkiqkxujazfwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993796.3659034-330-238320555654512/AnsiballZ_command.py'
Nov 24 14:16:36 compute-0 sudo[90484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:36 compute-0 python3.9[90486]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:16:36 compute-0 sudo[90484]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:37 compute-0 sudo[90639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ockfjonljcgjidfcokraybagcpixmtqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993796.9804127-338-175224807102637/AnsiballZ_command.py'
Nov 24 14:16:37 compute-0 sudo[90639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:37 compute-0 python3.9[90641]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:16:37 compute-0 ovs-vsctl[90642]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 24 14:16:37 compute-0 sudo[90639]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:38 compute-0 python3.9[90792]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:16:38 compute-0 sudo[90944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivrkmwladffpxuqyaftkcyvtztsrjlgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993798.388554-355-1931013198067/AnsiballZ_file.py'
Nov 24 14:16:38 compute-0 sudo[90944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:38 compute-0 python3.9[90946]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:16:38 compute-0 sudo[90944]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:39 compute-0 sudo[91096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlezibcqmxbwpqryolqrsgozajhpmzaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993798.9937992-363-82166835588045/AnsiballZ_stat.py'
Nov 24 14:16:39 compute-0 sudo[91096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:39 compute-0 python3.9[91098]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:39 compute-0 sudo[91096]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:39 compute-0 sudo[91174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvzidsjzxzxhwrlttiqtaporppeehwnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993798.9937992-363-82166835588045/AnsiballZ_file.py'
Nov 24 14:16:39 compute-0 sudo[91174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:39 compute-0 python3.9[91176]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:16:39 compute-0 sudo[91174]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:40 compute-0 sudo[91326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwzubpuuhwhtbljtxyfbowzwmilbpwrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993800.1099305-363-73639311675918/AnsiballZ_stat.py'
Nov 24 14:16:40 compute-0 sudo[91326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:40 compute-0 python3.9[91328]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:40 compute-0 sudo[91326]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:40 compute-0 sudo[91404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quanrqmqiysgbwloxqktvgzlhosrtaiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993800.1099305-363-73639311675918/AnsiballZ_file.py'
Nov 24 14:16:40 compute-0 sudo[91404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:41 compute-0 python3.9[91406]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:16:41 compute-0 sudo[91404]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:41 compute-0 sudo[91556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqikzvqlvhsnwboptdzpqjqghfxfqyum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993801.2878127-386-43933771043066/AnsiballZ_file.py'
Nov 24 14:16:41 compute-0 sudo[91556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:41 compute-0 python3.9[91558]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:41 compute-0 sudo[91556]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:42 compute-0 sudo[91708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dilzbfvwltywhdpthslnlcjvhhmjbxrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993801.9062788-394-42427646392953/AnsiballZ_stat.py'
Nov 24 14:16:42 compute-0 sudo[91708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:42 compute-0 python3.9[91710]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:42 compute-0 sudo[91708]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:42 compute-0 sudo[91786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpylcezmbmoswnoptprvnctkmxstsgdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993801.9062788-394-42427646392953/AnsiballZ_file.py'
Nov 24 14:16:42 compute-0 sudo[91786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:42 compute-0 python3.9[91788]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:42 compute-0 sudo[91786]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:43 compute-0 sudo[91938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhvolxkbokikbnbzmrpgcccxxzhubrfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993803.0452335-406-117414107287229/AnsiballZ_stat.py'
Nov 24 14:16:43 compute-0 sudo[91938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:43 compute-0 python3.9[91940]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:43 compute-0 sudo[91938]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:43 compute-0 sudo[92016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glxitnkeffjszwtkeuxmxwdedmfytwkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993803.0452335-406-117414107287229/AnsiballZ_file.py'
Nov 24 14:16:43 compute-0 sudo[92016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:43 compute-0 python3.9[92018]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:43 compute-0 sudo[92016]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:44 compute-0 sudo[92168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdvqxavpioaoscsxyxsymwfdbouovgxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993804.1055913-418-6913458548873/AnsiballZ_systemd.py'
Nov 24 14:16:44 compute-0 sudo[92168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:44 compute-0 python3.9[92170]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:16:44 compute-0 systemd[1]: Reloading.
Nov 24 14:16:44 compute-0 systemd-sysv-generator[92198]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:16:44 compute-0 systemd-rc-local-generator[92194]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:16:44 compute-0 sudo[92168]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:45 compute-0 sudo[92356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epxqbrssvbywhnbvgzvdccblvdmnavdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993805.1517386-426-231514768513190/AnsiballZ_stat.py'
Nov 24 14:16:45 compute-0 sudo[92356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:45 compute-0 python3.9[92358]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:45 compute-0 sudo[92356]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:45 compute-0 sudo[92434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtrcbloosdyefhbhblddtszpemukstcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993805.1517386-426-231514768513190/AnsiballZ_file.py'
Nov 24 14:16:45 compute-0 sudo[92434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:46 compute-0 python3.9[92436]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:46 compute-0 sudo[92434]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:46 compute-0 sudo[92586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgwilyuwcikyfjzxpoiynylzfidjmxkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993806.2492123-438-174001973885918/AnsiballZ_stat.py'
Nov 24 14:16:46 compute-0 sudo[92586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:46 compute-0 python3.9[92588]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:46 compute-0 sudo[92586]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:46 compute-0 sudo[92664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iffvxtjuetdmvivbawiajifxybejhovm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993806.2492123-438-174001973885918/AnsiballZ_file.py'
Nov 24 14:16:46 compute-0 sudo[92664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:47 compute-0 python3.9[92666]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:47 compute-0 sudo[92664]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:47 compute-0 sudo[92816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfzfyviwctulakdkjlkrdhochofubqpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993807.3269145-450-60851717409169/AnsiballZ_systemd.py'
Nov 24 14:16:47 compute-0 sudo[92816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:47 compute-0 python3.9[92818]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:16:47 compute-0 systemd[1]: Reloading.
Nov 24 14:16:48 compute-0 systemd-rc-local-generator[92845]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:16:48 compute-0 systemd-sysv-generator[92848]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:16:48 compute-0 systemd[1]: Starting Create netns directory...
Nov 24 14:16:48 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 14:16:48 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 14:16:48 compute-0 systemd[1]: Finished Create netns directory.
Nov 24 14:16:48 compute-0 sudo[92816]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:48 compute-0 sudo[93010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rexydmdknfuznubtxlfhxnymuwitaurh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993808.524546-460-63020029009907/AnsiballZ_file.py'
Nov 24 14:16:48 compute-0 sudo[93010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:49 compute-0 python3.9[93012]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:16:49 compute-0 sudo[93010]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:49 compute-0 sudo[93162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgnsykfclgbgazcnzwrbhlsqnxinctri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993809.2057555-468-75737445629422/AnsiballZ_stat.py'
Nov 24 14:16:49 compute-0 sudo[93162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:49 compute-0 python3.9[93164]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:49 compute-0 sudo[93162]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:50 compute-0 sudo[93285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaiupbfkvdffmoemmjwzaagjlulyfexd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993809.2057555-468-75737445629422/AnsiballZ_copy.py'
Nov 24 14:16:50 compute-0 sudo[93285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:50 compute-0 python3.9[93287]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993809.2057555-468-75737445629422/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:16:50 compute-0 sudo[93285]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:50 compute-0 sudo[93437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spqvtkdtiodtvlsvxttrdubaysqlldyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993810.5881302-485-64032276118442/AnsiballZ_file.py'
Nov 24 14:16:50 compute-0 sudo[93437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:51 compute-0 python3.9[93439]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:16:51 compute-0 sudo[93437]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:51 compute-0 sudo[93589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqwxnqpyoebjczjrqmqwzvtrmxcpgarp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993811.2060769-493-163456260477579/AnsiballZ_stat.py'
Nov 24 14:16:51 compute-0 sudo[93589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:51 compute-0 python3.9[93591]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:16:51 compute-0 sudo[93589]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:51 compute-0 sudo[93712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoloyvlxxrdvvlenecwfplzusvtxzpkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993811.2060769-493-163456260477579/AnsiballZ_copy.py'
Nov 24 14:16:51 compute-0 sudo[93712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:52 compute-0 python3.9[93714]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993811.2060769-493-163456260477579/.source.json _original_basename=.688tbg0m follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:52 compute-0 sudo[93712]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:52 compute-0 sudo[93864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oclxthcqxmljyiaabdeypjzwtgzwyswf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993812.2894561-508-149970808817513/AnsiballZ_file.py'
Nov 24 14:16:52 compute-0 sudo[93864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:52 compute-0 python3.9[93866]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:52 compute-0 sudo[93864]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:53 compute-0 sudo[94016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzjlrxidgyhhmgvsxdgobkyymjhnksqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993812.903486-516-125663227944265/AnsiballZ_stat.py'
Nov 24 14:16:53 compute-0 sudo[94016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:53 compute-0 sudo[94016]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:53 compute-0 sudo[94139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgwhjbexbilntbkklxegpwauzmzcqdpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993812.903486-516-125663227944265/AnsiballZ_copy.py'
Nov 24 14:16:53 compute-0 sudo[94139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:53 compute-0 sudo[94139]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:54 compute-0 sudo[94291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekszgiykyxoaetsyoccgnaozkqncjmhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993814.2832515-533-11229222328253/AnsiballZ_container_config_data.py'
Nov 24 14:16:54 compute-0 sudo[94291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:54 compute-0 python3.9[94293]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 24 14:16:54 compute-0 sudo[94291]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:55 compute-0 sudo[94443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocoackripxvjvywmbyzfjrgjgovsextu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993815.169347-542-75814745879647/AnsiballZ_container_config_hash.py'
Nov 24 14:16:55 compute-0 sudo[94443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:55 compute-0 python3.9[94445]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 14:16:55 compute-0 sudo[94443]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:56 compute-0 sudo[94595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxbawegvzcmdegdndkgofbwzunxzyezz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993816.1199656-551-214905983232309/AnsiballZ_podman_container_info.py'
Nov 24 14:16:56 compute-0 sudo[94595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:56 compute-0 python3.9[94597]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 24 14:16:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:16:56 compute-0 sudo[94595]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:57 compute-0 sudo[94759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exfxmziiaahdmtsuusylocclgikktrbl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763993817.2492666-564-273934965564512/AnsiballZ_edpm_container_manage.py'
Nov 24 14:16:57 compute-0 sudo[94759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:58 compute-0 python3[94761]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 14:16:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:16:58 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:16:58 compute-0 podman[94799]: 2025-11-24 14:16:58.232981364 +0000 UTC m=+0.044951381 container create 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20251118)
Nov 24 14:16:58 compute-0 podman[94799]: 2025-11-24 14:16:58.209380774 +0000 UTC m=+0.021350821 image pull 197857ba4b35dfe0da58eb2e9c37f91c8a1d2b66c0967b4c66656aa6329b870c quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e
Nov 24 14:16:58 compute-0 python3[94761]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e
Nov 24 14:16:58 compute-0 sudo[94759]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:58 compute-0 sudo[94987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxmncunwdhquclgrkzlosieyaqnkngym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993818.5011513-572-213858334327952/AnsiballZ_stat.py'
Nov 24 14:16:58 compute-0 sudo[94987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:58 compute-0 python3.9[94989]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:16:58 compute-0 sudo[94987]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 24 14:16:59 compute-0 sudo[95141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkgdvgyongjoiomabzobipqipqurxafj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993819.1388628-581-202795893292003/AnsiballZ_file.py'
Nov 24 14:16:59 compute-0 sudo[95141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:59 compute-0 python3.9[95143]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:16:59 compute-0 sudo[95141]: pam_unix(sudo:session): session closed for user root
Nov 24 14:16:59 compute-0 sudo[95217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okkvzaeatvbqksfprmeqvkcaeigugorv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993819.1388628-581-202795893292003/AnsiballZ_stat.py'
Nov 24 14:16:59 compute-0 sudo[95217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:16:59 compute-0 python3.9[95219]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:17:00 compute-0 sudo[95217]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:00 compute-0 sudo[95368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywstzttzjapyhgkdvsqmqjgvdnmrrvgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993820.0772433-581-51056953979606/AnsiballZ_copy.py'
Nov 24 14:17:00 compute-0 sudo[95368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:00 compute-0 python3.9[95370]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763993820.0772433-581-51056953979606/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:00 compute-0 sudo[95368]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:00 compute-0 sudo[95444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzpskraahainsisuswuogtymnemarduv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993820.0772433-581-51056953979606/AnsiballZ_systemd.py'
Nov 24 14:17:00 compute-0 sudo[95444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:01 compute-0 python3.9[95446]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:17:01 compute-0 systemd[1]: Reloading.
Nov 24 14:17:01 compute-0 systemd-rc-local-generator[95472]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:17:01 compute-0 systemd-sysv-generator[95476]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:17:01 compute-0 sudo[95444]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:01 compute-0 sudo[95554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iugwsvwvkhvmbtenihmdbgpudpyrlehz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993820.0772433-581-51056953979606/AnsiballZ_systemd.py'
Nov 24 14:17:01 compute-0 sudo[95554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:02 compute-0 python3.9[95556]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:17:02 compute-0 systemd[1]: Reloading.
Nov 24 14:17:02 compute-0 systemd-sysv-generator[95592]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:17:02 compute-0 systemd-rc-local-generator[95589]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:17:02 compute-0 systemd[1]: Starting ovn_controller container...
Nov 24 14:17:02 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 24 14:17:02 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:17:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cc5d2d2f1912bd784715eb14acd1dae934ab17988cebfe787417ece9ddfbaed/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 24 14:17:02 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff.
Nov 24 14:17:02 compute-0 podman[95597]: 2025-11-24 14:17:02.602324962 +0000 UTC m=+0.139974009 container init 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:17:02 compute-0 ovn_controller[95613]: + sudo -E kolla_set_configs
Nov 24 14:17:02 compute-0 podman[95597]: 2025-11-24 14:17:02.633260281 +0000 UTC m=+0.170909328 container start 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:17:02 compute-0 edpm-start-podman-container[95597]: ovn_controller
Nov 24 14:17:02 compute-0 systemd[1]: Created slice User Slice of UID 0.
Nov 24 14:17:02 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 24 14:17:02 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 24 14:17:02 compute-0 systemd[1]: Starting User Manager for UID 0...
Nov 24 14:17:02 compute-0 systemd[95652]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Nov 24 14:17:02 compute-0 edpm-start-podman-container[95596]: Creating additional drop-in dependency for "ovn_controller" (48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff)
Nov 24 14:17:02 compute-0 podman[95619]: 2025-11-24 14:17:02.733642525 +0000 UTC m=+0.088301958 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:17:02 compute-0 systemd[1]: 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff-35a56de3cd4de91f.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 14:17:02 compute-0 systemd[1]: 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff-35a56de3cd4de91f.service: Failed with result 'exit-code'.
Nov 24 14:17:02 compute-0 systemd[1]: Reloading.
Nov 24 14:17:02 compute-0 systemd[95652]: Queued start job for default target Main User Target.
Nov 24 14:17:02 compute-0 systemd[95652]: Created slice User Application Slice.
Nov 24 14:17:02 compute-0 systemd-sysv-generator[95704]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:17:02 compute-0 systemd[95652]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 24 14:17:02 compute-0 systemd[95652]: Started Daily Cleanup of User's Temporary Directories.
Nov 24 14:17:02 compute-0 systemd[95652]: Reached target Paths.
Nov 24 14:17:02 compute-0 systemd[95652]: Reached target Timers.
Nov 24 14:17:02 compute-0 systemd-rc-local-generator[95698]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:17:02 compute-0 systemd[95652]: Starting D-Bus User Message Bus Socket...
Nov 24 14:17:02 compute-0 systemd[95652]: Starting Create User's Volatile Files and Directories...
Nov 24 14:17:02 compute-0 systemd[95652]: Finished Create User's Volatile Files and Directories.
Nov 24 14:17:02 compute-0 systemd[95652]: Listening on D-Bus User Message Bus Socket.
Nov 24 14:17:02 compute-0 systemd[95652]: Reached target Sockets.
Nov 24 14:17:02 compute-0 systemd[95652]: Reached target Basic System.
Nov 24 14:17:02 compute-0 systemd[95652]: Reached target Main User Target.
Nov 24 14:17:02 compute-0 systemd[95652]: Startup finished in 130ms.
Nov 24 14:17:02 compute-0 systemd[1]: Started User Manager for UID 0.
Nov 24 14:17:02 compute-0 systemd[1]: Started ovn_controller container.
Nov 24 14:17:03 compute-0 systemd[1]: Started Session c1 of User root.
Nov 24 14:17:03 compute-0 sudo[95554]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:03 compute-0 ovn_controller[95613]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 14:17:03 compute-0 ovn_controller[95613]: INFO:__main__:Validating config file
Nov 24 14:17:03 compute-0 ovn_controller[95613]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 14:17:03 compute-0 ovn_controller[95613]: INFO:__main__:Writing out command to execute
Nov 24 14:17:03 compute-0 ovn_controller[95613]: ++ cat /run_command
Nov 24 14:17:03 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 24 14:17:03 compute-0 ovn_controller[95613]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 24 14:17:03 compute-0 ovn_controller[95613]: + ARGS=
Nov 24 14:17:03 compute-0 ovn_controller[95613]: + sudo kolla_copy_cacerts
Nov 24 14:17:03 compute-0 systemd[1]: Started Session c2 of User root.
Nov 24 14:17:03 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 24 14:17:03 compute-0 ovn_controller[95613]: + [[ ! -n '' ]]
Nov 24 14:17:03 compute-0 ovn_controller[95613]: + . kolla_extend_start
Nov 24 14:17:03 compute-0 ovn_controller[95613]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 24 14:17:03 compute-0 ovn_controller[95613]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 24 14:17:03 compute-0 ovn_controller[95613]: + umask 0022
Nov 24 14:17:03 compute-0 ovn_controller[95613]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 24 14:17:03 compute-0 NetworkManager[55697]: <info>  [1763993823.1493] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 24 14:17:03 compute-0 NetworkManager[55697]: <info>  [1763993823.1499] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:17:03 compute-0 NetworkManager[55697]: <info>  [1763993823.1508] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 24 14:17:03 compute-0 NetworkManager[55697]: <info>  [1763993823.1512] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 24 14:17:03 compute-0 NetworkManager[55697]: <info>  [1763993823.1515] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 24 14:17:03 compute-0 kernel: br-int: entered promiscuous mode
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00024|main|INFO|OVS feature set changed, force recompute.
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 14:17:03 compute-0 ovn_controller[95613]: 2025-11-24T14:17:03Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 24 14:17:03 compute-0 NetworkManager[55697]: <info>  [1763993823.1716] manager: (ovn-bfb4c4-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 24 14:17:03 compute-0 systemd-udevd[95771]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:17:03 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Nov 24 14:17:03 compute-0 NetworkManager[55697]: <info>  [1763993823.1877] device (genev_sys_6081): carrier: link connected
Nov 24 14:17:03 compute-0 NetworkManager[55697]: <info>  [1763993823.1880] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 24 14:17:03 compute-0 systemd-udevd[95772]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:17:03 compute-0 sudo[95877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjlsprrjhybtjixxenrjyftdfyrgksyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993823.1687567-609-213804472558125/AnsiballZ_command.py'
Nov 24 14:17:03 compute-0 sudo[95877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:03 compute-0 python3.9[95879]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:17:03 compute-0 ovs-vsctl[95880]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 24 14:17:03 compute-0 sudo[95877]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:04 compute-0 sudo[96030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifyqbhcycyeiuozapjcfaswypevrhopz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993823.8254976-617-234051838037887/AnsiballZ_command.py'
Nov 24 14:17:04 compute-0 sudo[96030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:04 compute-0 python3.9[96032]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:17:04 compute-0 ovs-vsctl[96034]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 24 14:17:04 compute-0 sudo[96030]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:05 compute-0 sudo[96185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qldvbnhqbiadmypkbpvuooevcoewtcxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993824.7270222-631-16262626439103/AnsiballZ_command.py'
Nov 24 14:17:05 compute-0 sudo[96185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:05 compute-0 python3.9[96187]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:17:05 compute-0 ovs-vsctl[96188]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 24 14:17:05 compute-0 sudo[96185]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:05 compute-0 sshd-session[85115]: Connection closed by 192.168.122.30 port 45838
Nov 24 14:17:05 compute-0 sshd-session[85112]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:17:05 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Nov 24 14:17:05 compute-0 systemd[1]: session-20.scope: Consumed 43.164s CPU time.
Nov 24 14:17:05 compute-0 systemd-logind[807]: Session 20 logged out. Waiting for processes to exit.
Nov 24 14:17:05 compute-0 systemd-logind[807]: Removed session 20.
Nov 24 14:17:11 compute-0 sshd-session[96214]: Accepted publickey for zuul from 192.168.122.30 port 47958 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:17:11 compute-0 systemd-logind[807]: New session 22 of user zuul.
Nov 24 14:17:11 compute-0 systemd[1]: Started Session 22 of User zuul.
Nov 24 14:17:11 compute-0 sshd-session[96214]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:17:12 compute-0 python3.9[96367]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:17:13 compute-0 systemd[1]: Stopping User Manager for UID 0...
Nov 24 14:17:13 compute-0 systemd[95652]: Activating special unit Exit the Session...
Nov 24 14:17:13 compute-0 systemd[95652]: Stopped target Main User Target.
Nov 24 14:17:13 compute-0 systemd[95652]: Stopped target Basic System.
Nov 24 14:17:13 compute-0 systemd[95652]: Stopped target Paths.
Nov 24 14:17:13 compute-0 systemd[95652]: Stopped target Sockets.
Nov 24 14:17:13 compute-0 systemd[95652]: Stopped target Timers.
Nov 24 14:17:13 compute-0 systemd[95652]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 24 14:17:13 compute-0 systemd[95652]: Closed D-Bus User Message Bus Socket.
Nov 24 14:17:13 compute-0 systemd[95652]: Stopped Create User's Volatile Files and Directories.
Nov 24 14:17:13 compute-0 systemd[95652]: Removed slice User Application Slice.
Nov 24 14:17:13 compute-0 systemd[95652]: Reached target Shutdown.
Nov 24 14:17:13 compute-0 systemd[95652]: Finished Exit the Session.
Nov 24 14:17:13 compute-0 systemd[95652]: Reached target Exit the Session.
Nov 24 14:17:13 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Nov 24 14:17:13 compute-0 systemd[1]: Stopped User Manager for UID 0.
Nov 24 14:17:13 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 24 14:17:13 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 24 14:17:13 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 24 14:17:13 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 24 14:17:13 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Nov 24 14:17:13 compute-0 sudo[96523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riegsgcuussdfwqmtwtevmitectvbcsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993832.934511-34-83362727140502/AnsiballZ_file.py'
Nov 24 14:17:13 compute-0 sudo[96523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:13 compute-0 python3.9[96525]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:13 compute-0 sudo[96523]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:14 compute-0 sudo[96675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbnnpsowwvcrhpidonxgutiurcfglbhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993833.7044168-34-278425898832355/AnsiballZ_file.py'
Nov 24 14:17:14 compute-0 sudo[96675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:14 compute-0 python3.9[96677]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:14 compute-0 sudo[96675]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:14 compute-0 sudo[96827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlegfyyznfrevoavvodgwuugfqljpzkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993834.3444085-34-147059708630404/AnsiballZ_file.py'
Nov 24 14:17:14 compute-0 sudo[96827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:14 compute-0 python3.9[96829]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:14 compute-0 sudo[96827]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:15 compute-0 sudo[96979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nroxdhihdiwnkseyyzyoumzuzermvfly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993834.9513707-34-152475182350686/AnsiballZ_file.py'
Nov 24 14:17:15 compute-0 sudo[96979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:15 compute-0 python3.9[96981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:15 compute-0 sudo[96979]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:15 compute-0 sudo[97131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtlklyvwvjjkbeypidvipjrmfobyhzfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993835.557809-34-1597741055457/AnsiballZ_file.py'
Nov 24 14:17:15 compute-0 sudo[97131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:15 compute-0 python3.9[97133]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:15 compute-0 sudo[97131]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:16 compute-0 python3.9[97283]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:17:17 compute-0 sudo[97433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guvubttdqtvymnpspcpjfaatqnokgydn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993836.8773632-78-22154543946133/AnsiballZ_seboolean.py'
Nov 24 14:17:17 compute-0 sudo[97433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:17 compute-0 python3.9[97435]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 24 14:17:18 compute-0 sudo[97433]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:18 compute-0 python3.9[97585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:19 compute-0 python3.9[97706]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993838.2756135-86-116469429313899/.source follow=False _original_basename=haproxy.j2 checksum=deae64da24ad28f71dc47276f2e9f268f19a4519 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:20 compute-0 python3.9[97856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:20 compute-0 python3.9[97977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993839.645465-101-233699180854133/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:21 compute-0 sudo[98128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jljglarcqmxfbcdzqsozgpgynxnjcano ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993840.948391-118-80776691785147/AnsiballZ_setup.py'
Nov 24 14:17:21 compute-0 sudo[98128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:21 compute-0 python3.9[98130]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:17:21 compute-0 sudo[98128]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:22 compute-0 sudo[98212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ievypollfagrdtdbiheclkkiiprpbekc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993840.948391-118-80776691785147/AnsiballZ_dnf.py'
Nov 24 14:17:22 compute-0 sudo[98212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:22 compute-0 python3.9[98214]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:17:23 compute-0 sudo[98212]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:24 compute-0 sudo[98365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jemzvgurndmrgnbwuywspimqharpcswm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993843.906175-130-246825678211721/AnsiballZ_systemd.py'
Nov 24 14:17:24 compute-0 sudo[98365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:24 compute-0 python3.9[98367]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 14:17:24 compute-0 sudo[98365]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:25 compute-0 python3.9[98520]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:25 compute-0 python3.9[98641]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993844.9910676-138-261341592825801/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:26 compute-0 python3.9[98791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:27 compute-0 python3.9[98912]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993846.1068163-138-108239104224416/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:28 compute-0 python3.9[99062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:28 compute-0 python3.9[99183]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993847.7972274-182-230095137357338/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:29 compute-0 python3.9[99333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:29 compute-0 python3.9[99454]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993848.888059-182-36426218208110/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:30 compute-0 python3.9[99604]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:17:31 compute-0 sudo[99756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiaofgltvzosoftkvcremclzpewhwmdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993850.7622697-220-111135558306174/AnsiballZ_file.py'
Nov 24 14:17:31 compute-0 sudo[99756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:31 compute-0 python3.9[99758]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:31 compute-0 sudo[99756]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:31 compute-0 sudo[99908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auokdydxrcyelwwvrgtyrgcdzgbudrme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993851.4178197-228-92420362470505/AnsiballZ_stat.py'
Nov 24 14:17:31 compute-0 sudo[99908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:31 compute-0 python3.9[99910]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:31 compute-0 sudo[99908]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:32 compute-0 sudo[99986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpgpbmkphvtgaaocqtgfdxepfzhxadfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993851.4178197-228-92420362470505/AnsiballZ_file.py'
Nov 24 14:17:32 compute-0 sudo[99986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:32 compute-0 python3.9[99988]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:32 compute-0 sudo[99986]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:32 compute-0 sudo[100138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpuagvjqntlcuydfmsntbvbuzlgarlgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993852.465062-228-19823637009359/AnsiballZ_stat.py'
Nov 24 14:17:32 compute-0 sudo[100138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:32 compute-0 python3.9[100140]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:32 compute-0 sudo[100138]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:33 compute-0 sudo[100226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdmaghjvvdrwqaeaiatmhcqlcxvgxilm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993852.465062-228-19823637009359/AnsiballZ_file.py'
Nov 24 14:17:33 compute-0 sudo[100226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:33 compute-0 ovn_controller[95613]: 2025-11-24T14:17:33Z|00025|memory|INFO|17152 kB peak resident set size after 30.1 seconds
Nov 24 14:17:33 compute-0 ovn_controller[95613]: 2025-11-24T14:17:33Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Nov 24 14:17:33 compute-0 podman[100190]: 2025-11-24 14:17:33.223412896 +0000 UTC m=+0.084130438 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:17:33 compute-0 python3.9[100234]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:33 compute-0 sudo[100226]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:33 compute-0 sudo[100394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aisymucrwfibzpwsrlxawvvqcaipcvsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993853.5080438-251-233760114138835/AnsiballZ_file.py'
Nov 24 14:17:33 compute-0 sudo[100394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:34 compute-0 python3.9[100396]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:34 compute-0 sudo[100394]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:34 compute-0 sudo[100546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpuawktkecxszevkbufnxjtldvmbgntv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993854.2288833-259-100521979179368/AnsiballZ_stat.py'
Nov 24 14:17:34 compute-0 sudo[100546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:34 compute-0 python3.9[100548]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:34 compute-0 sudo[100546]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:34 compute-0 sudo[100624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adgjrecnykvnivqpjpjxfqnkpfzzlvcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993854.2288833-259-100521979179368/AnsiballZ_file.py'
Nov 24 14:17:34 compute-0 sudo[100624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:35 compute-0 python3.9[100626]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:35 compute-0 sudo[100624]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:35 compute-0 sudo[100776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxmmldpovrjguhvqkzpqvmtbljrldcyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993855.2997057-271-200336056230277/AnsiballZ_stat.py'
Nov 24 14:17:35 compute-0 sudo[100776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:35 compute-0 python3.9[100778]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:35 compute-0 sudo[100776]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:36 compute-0 sudo[100854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igxnwdnjsdfrvrenzasspirhvmxfnumj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993855.2997057-271-200336056230277/AnsiballZ_file.py'
Nov 24 14:17:36 compute-0 sudo[100854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:36 compute-0 python3.9[100856]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:36 compute-0 sudo[100854]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:36 compute-0 sudo[101006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlffmabooozmdbbagdkoehfdsljygspt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993856.4336402-283-205959941679503/AnsiballZ_systemd.py'
Nov 24 14:17:36 compute-0 sudo[101006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:37 compute-0 python3.9[101008]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:17:37 compute-0 systemd[1]: Reloading.
Nov 24 14:17:37 compute-0 systemd-rc-local-generator[101034]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:17:37 compute-0 systemd-sysv-generator[101038]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:17:37 compute-0 sudo[101006]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:37 compute-0 sudo[101195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymytgxdvbbuwuxmayqoelhzttjkprzla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993857.474316-291-228623250207636/AnsiballZ_stat.py'
Nov 24 14:17:37 compute-0 sudo[101195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:37 compute-0 python3.9[101197]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:37 compute-0 sudo[101195]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:38 compute-0 sudo[101273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gejjnitahkchrunlxvuruhtkvxoqgmfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993857.474316-291-228623250207636/AnsiballZ_file.py'
Nov 24 14:17:38 compute-0 sudo[101273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:38 compute-0 python3.9[101275]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:38 compute-0 sudo[101273]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:38 compute-0 sudo[101425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ersohpgbcxeeqozqtkpfoykevoymnaym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993858.5679116-303-99065807799631/AnsiballZ_stat.py'
Nov 24 14:17:38 compute-0 sudo[101425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:39 compute-0 python3.9[101427]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:39 compute-0 sudo[101425]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:39 compute-0 sudo[101503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsaavllhowyjmjrnetjunambcvoxxlph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993858.5679116-303-99065807799631/AnsiballZ_file.py'
Nov 24 14:17:39 compute-0 sudo[101503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:39 compute-0 python3.9[101505]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:39 compute-0 sudo[101503]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:39 compute-0 sudo[101655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffxovrgmxwkvggdvjpyfwglmtzqvzroz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993859.629189-315-144313251090587/AnsiballZ_systemd.py'
Nov 24 14:17:39 compute-0 sudo[101655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:40 compute-0 python3.9[101657]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:17:40 compute-0 systemd[1]: Reloading.
Nov 24 14:17:40 compute-0 systemd-rc-local-generator[101685]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:17:40 compute-0 systemd-sysv-generator[101688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:17:40 compute-0 systemd[1]: Starting Create netns directory...
Nov 24 14:17:40 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 14:17:40 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 14:17:40 compute-0 systemd[1]: Finished Create netns directory.
Nov 24 14:17:40 compute-0 sudo[101655]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:41 compute-0 sudo[101848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apcdwmarrguhvegrzzolkhkbxacdpgij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993860.7370467-325-144233511354340/AnsiballZ_file.py'
Nov 24 14:17:41 compute-0 sudo[101848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:41 compute-0 python3.9[101850]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:41 compute-0 sudo[101848]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:41 compute-0 sudo[102000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twscnghvtxjuufhykijgswgfsvxciyjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993861.3956113-333-31124779771424/AnsiballZ_stat.py'
Nov 24 14:17:41 compute-0 sudo[102000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:41 compute-0 python3.9[102002]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:41 compute-0 sudo[102000]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:42 compute-0 sudo[102123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpgkixrlyqvuolrpwxduoizsdexxpxmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993861.3956113-333-31124779771424/AnsiballZ_copy.py'
Nov 24 14:17:42 compute-0 sudo[102123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:42 compute-0 python3.9[102125]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763993861.3956113-333-31124779771424/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:42 compute-0 sudo[102123]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:42 compute-0 sudo[102275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzgjifiheodessrnipyqdxrjvppssccs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993862.6906834-350-56933530249956/AnsiballZ_file.py'
Nov 24 14:17:42 compute-0 sudo[102275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:43 compute-0 python3.9[102277]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:17:43 compute-0 sudo[102275]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:43 compute-0 sudo[102427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acktmycshzpcoukxdzjjqflsypvkvzxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993863.3152485-358-172490110479560/AnsiballZ_stat.py'
Nov 24 14:17:43 compute-0 sudo[102427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:43 compute-0 python3.9[102429]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:17:43 compute-0 sudo[102427]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:44 compute-0 sudo[102550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saqmsmmsgfjxwobsscqzuoxxmgvjemwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993863.3152485-358-172490110479560/AnsiballZ_copy.py'
Nov 24 14:17:44 compute-0 sudo[102550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:44 compute-0 python3.9[102552]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763993863.3152485-358-172490110479560/.source.json _original_basename=.1eii0_p1 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:44 compute-0 sudo[102550]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:44 compute-0 sudo[102702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpiguqzdkanjrbhozfkdtilqfjffryfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993864.4158628-373-206040878158863/AnsiballZ_file.py'
Nov 24 14:17:44 compute-0 sudo[102702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:44 compute-0 python3.9[102704]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:44 compute-0 sudo[102702]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:45 compute-0 sudo[102854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxhyzhtjfyutgkvfrbligqdrghnctexu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993865.1437852-381-243020868244888/AnsiballZ_stat.py'
Nov 24 14:17:45 compute-0 sudo[102854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:45 compute-0 sudo[102854]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:45 compute-0 sudo[102977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chkkrxktgwcmkmtfopzehxncwwzqdxjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993865.1437852-381-243020868244888/AnsiballZ_copy.py'
Nov 24 14:17:45 compute-0 sudo[102977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:46 compute-0 sudo[102977]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:46 compute-0 sudo[103129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnhiextejzlltqmnoksvrunpzfwbqiie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993866.503448-398-265135450747302/AnsiballZ_container_config_data.py'
Nov 24 14:17:46 compute-0 sudo[103129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:47 compute-0 python3.9[103131]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 24 14:17:47 compute-0 sudo[103129]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:47 compute-0 sudo[103281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrclgyuwowbppqftxrsnsayjyqynpswq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993867.328426-407-103827068241/AnsiballZ_container_config_hash.py'
Nov 24 14:17:47 compute-0 sudo[103281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:47 compute-0 python3.9[103283]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 14:17:47 compute-0 sudo[103281]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:48 compute-0 sudo[103433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzeobkxnonwdthjyvcgbhhjumftssccq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993868.1293132-416-146494434622515/AnsiballZ_podman_container_info.py'
Nov 24 14:17:48 compute-0 sudo[103433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:48 compute-0 python3.9[103435]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 24 14:17:48 compute-0 sudo[103433]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:49 compute-0 sudo[103611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzixrshriwhqtymbhttlegklojhyyvvu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763993869.2602463-429-44780444613673/AnsiballZ_edpm_container_manage.py'
Nov 24 14:17:49 compute-0 sudo[103611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:49 compute-0 python3[103613]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 14:17:50 compute-0 podman[103651]: 2025-11-24 14:17:50.131417261 +0000 UTC m=+0.046496292 container create 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS)
Nov 24 14:17:50 compute-0 podman[103651]: 2025-11-24 14:17:50.106724881 +0000 UTC m=+0.021803932 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:17:50 compute-0 python3[103613]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:17:50 compute-0 sudo[103611]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:50 compute-0 sudo[103837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpljztjzcswkllhgojskkiulyddilkxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993870.4148915-437-2314981220484/AnsiballZ_stat.py'
Nov 24 14:17:50 compute-0 sudo[103837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:50 compute-0 python3.9[103839]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:17:50 compute-0 sudo[103837]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:51 compute-0 sudo[103991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdmnikewwchomrpndqlaienxzeqeaggo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993871.1264899-446-78528688373287/AnsiballZ_file.py'
Nov 24 14:17:51 compute-0 sudo[103991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:51 compute-0 python3.9[103993]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:51 compute-0 sudo[103991]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:51 compute-0 sudo[104067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xofwmyzttxigratkdtfmzmnelepzuohc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993871.1264899-446-78528688373287/AnsiballZ_stat.py'
Nov 24 14:17:51 compute-0 sudo[104067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:52 compute-0 python3.9[104069]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:17:52 compute-0 sudo[104067]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:52 compute-0 sudo[104218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqfakzxxhlepgoekpvofhylktshryvkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993872.1261454-446-209750050728994/AnsiballZ_copy.py'
Nov 24 14:17:52 compute-0 sudo[104218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:52 compute-0 python3.9[104220]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763993872.1261454-446-209750050728994/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:17:52 compute-0 sudo[104218]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:53 compute-0 sudo[104294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aktepkdbsnrxzwirveyajhfhdhrwaujz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993872.1261454-446-209750050728994/AnsiballZ_systemd.py'
Nov 24 14:17:53 compute-0 sudo[104294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:53 compute-0 python3.9[104296]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:17:53 compute-0 systemd[1]: Reloading.
Nov 24 14:17:53 compute-0 systemd-rc-local-generator[104321]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:17:53 compute-0 systemd-sysv-generator[104326]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:17:53 compute-0 sudo[104294]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:53 compute-0 sudo[104406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psoixqmkbjaqqlbkibjtdmqvsotxgkut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993872.1261454-446-209750050728994/AnsiballZ_systemd.py'
Nov 24 14:17:53 compute-0 sudo[104406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:17:54 compute-0 python3.9[104408]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:17:54 compute-0 systemd[1]: Reloading.
Nov 24 14:17:54 compute-0 systemd-rc-local-generator[104435]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:17:54 compute-0 systemd-sysv-generator[104439]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:17:54 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Nov 24 14:17:54 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:17:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fdc215ab32866820b607be0d13ccc99961fac336ba142c789e9ecec9ba145e6/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 24 14:17:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fdc215ab32866820b607be0d13ccc99961fac336ba142c789e9ecec9ba145e6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:17:54 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4.
Nov 24 14:17:54 compute-0 podman[104449]: 2025-11-24 14:17:54.708254926 +0000 UTC m=+0.150940104 container init 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: + sudo -E kolla_set_configs
Nov 24 14:17:54 compute-0 podman[104449]: 2025-11-24 14:17:54.734797934 +0000 UTC m=+0.177483132 container start 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:17:54 compute-0 edpm-start-podman-container[104449]: ovn_metadata_agent
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Validating config file
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Copying service configuration files
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Writing out command to execute
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: ++ cat /run_command
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: + CMD=neutron-ovn-metadata-agent
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: + ARGS=
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: + sudo kolla_copy_cacerts
Nov 24 14:17:54 compute-0 edpm-start-podman-container[104448]: Creating additional drop-in dependency for "ovn_metadata_agent" (765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4)
Nov 24 14:17:54 compute-0 podman[104471]: 2025-11-24 14:17:54.81568527 +0000 UTC m=+0.064527421 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: + [[ ! -n '' ]]
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: + . kolla_extend_start
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: Running command: 'neutron-ovn-metadata-agent'
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: + umask 0022
Nov 24 14:17:54 compute-0 ovn_metadata_agent[104464]: + exec neutron-ovn-metadata-agent
Nov 24 14:17:54 compute-0 systemd[1]: Reloading.
Nov 24 14:17:54 compute-0 systemd-sysv-generator[104544]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:17:54 compute-0 systemd-rc-local-generator[104537]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:17:55 compute-0 systemd[1]: Started ovn_metadata_agent container.
Nov 24 14:17:55 compute-0 sudo[104406]: pam_unix(sudo:session): session closed for user root
Nov 24 14:17:55 compute-0 sshd-session[96217]: Connection closed by 192.168.122.30 port 47958
Nov 24 14:17:55 compute-0 sshd-session[96214]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:17:55 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Nov 24 14:17:55 compute-0 systemd[1]: session-22.scope: Consumed 32.398s CPU time.
Nov 24 14:17:55 compute-0 systemd-logind[807]: Session 22 logged out. Waiting for processes to exit.
Nov 24 14:17:55 compute-0 systemd-logind[807]: Removed session 22.
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.605 104469 INFO neutron.common.config [-] Logging enabled!
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.606 104469 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.606 104469 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.606 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.606 104469 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.607 104469 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.607 104469 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.607 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.607 104469 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.607 104469 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.607 104469 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.607 104469 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.608 104469 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.608 104469 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.608 104469 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.608 104469 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.608 104469 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.608 104469 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.608 104469 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.608 104469 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.609 104469 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.609 104469 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.609 104469 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.609 104469 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.609 104469 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.609 104469 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.609 104469 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.609 104469 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.609 104469 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.610 104469 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.610 104469 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.610 104469 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.610 104469 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.610 104469 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.610 104469 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.610 104469 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.610 104469 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.610 104469 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.611 104469 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.612 104469 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.613 104469 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.613 104469 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.613 104469 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.613 104469 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.613 104469 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.613 104469 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.613 104469 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.613 104469 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.614 104469 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.614 104469 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.614 104469 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.614 104469 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.614 104469 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.614 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.614 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.614 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.614 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.615 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.615 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.615 104469 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.615 104469 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.615 104469 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.615 104469 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.615 104469 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.615 104469 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.615 104469 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.616 104469 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.616 104469 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.616 104469 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.616 104469 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.616 104469 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.616 104469 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.616 104469 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.616 104469 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.616 104469 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.617 104469 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.618 104469 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.618 104469 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.618 104469 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.618 104469 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.618 104469 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.618 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.618 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.618 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.618 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.619 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.619 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.619 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.619 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.619 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.619 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.619 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.619 104469 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.619 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.620 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.620 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.620 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.620 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.620 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.620 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.620 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.620 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.620 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.621 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.621 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.621 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.621 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.621 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.621 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.621 104469 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.621 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.621 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.622 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.623 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.623 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.623 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.623 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.623 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.623 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.623 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.623 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.623 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.624 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.624 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.624 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.624 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.624 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.624 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.624 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.624 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.624 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.625 104469 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.626 104469 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.626 104469 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.626 104469 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.626 104469 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.626 104469 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.626 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.626 104469 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.626 104469 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.626 104469 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.627 104469 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.627 104469 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.627 104469 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.627 104469 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.627 104469 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.627 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.627 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.627 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.627 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.628 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.629 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.629 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.629 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.629 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.629 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.629 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.629 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.629 104469 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.629 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.630 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.631 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.631 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.631 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.631 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.631 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.631 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.631 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.632 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.633 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.633 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.633 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.633 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.633 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.633 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.633 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.633 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.633 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.634 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.635 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.635 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.635 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.635 104469 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.635 104469 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.635 104469 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.635 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.635 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.635 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.636 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.637 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.637 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.637 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.637 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.637 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.637 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.637 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.637 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.637 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.638 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.638 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.638 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.638 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.638 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.638 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.638 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.638 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.639 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.639 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.639 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.639 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.639 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.639 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.639 104469 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.640 104469 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.648 104469 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.648 104469 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.648 104469 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.649 104469 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.649 104469 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.660 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name dfd2f9fd-c9ed-4d16-a231-48176f986586 (UUID: dfd2f9fd-c9ed-4d16-a231-48176f986586) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.683 104469 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.683 104469 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.683 104469 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.683 104469 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.686 104469 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.691 104469 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.696 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'dfd2f9fd-c9ed-4d16-a231-48176f986586'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], external_ids={}, name=dfd2f9fd-c9ed-4d16-a231-48176f986586, nb_cfg_timestamp=1763993831176, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.697 104469 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f3e5660b160>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.698 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.698 104469 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.698 104469 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.698 104469 INFO oslo_service.service [-] Starting 1 workers
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.702 104469 DEBUG oslo_service.service [-] Started child 104579 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.705 104469 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpstnmvzww/privsep.sock']
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.706 104579 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-954174'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.727 104579 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.728 104579 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.728 104579 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.731 104579 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.737 104579 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 24 14:17:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:56.743 104579 INFO eventlet.wsgi.server [-] (104579) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Nov 24 14:17:57 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.357 104469 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.358 104469 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpstnmvzww/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.225 104584 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.230 104584 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.234 104584 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.235 104584 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104584
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.360 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[adee701c-0985-4b24-adea-d40eed16771a]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.890 104584 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.890 104584 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:17:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:57.890 104584 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.441 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a2268e36-635d-4974-ab1c-08ee8651db15]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.443 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, column=external_ids, values=({'neutron:ovn-metadata-id': '6190cc76-712a-5201-9af2-0e90d0b01b24'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.452 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.457 104469 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.457 104469 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.457 104469 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.457 104469 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.458 104469 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.458 104469 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.458 104469 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.458 104469 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.458 104469 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.458 104469 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.458 104469 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.458 104469 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.459 104469 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.460 104469 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.460 104469 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.460 104469 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.460 104469 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.460 104469 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.460 104469 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.460 104469 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.460 104469 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.461 104469 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.461 104469 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.461 104469 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.461 104469 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.461 104469 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.461 104469 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.461 104469 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.462 104469 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.462 104469 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.462 104469 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.462 104469 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.462 104469 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.462 104469 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.462 104469 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.462 104469 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.463 104469 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.464 104469 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.464 104469 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.464 104469 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.464 104469 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.464 104469 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.464 104469 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.464 104469 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.464 104469 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.464 104469 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.465 104469 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.465 104469 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.465 104469 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.465 104469 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.465 104469 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.465 104469 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.465 104469 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.465 104469 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.465 104469 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.466 104469 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.466 104469 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.466 104469 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.466 104469 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.466 104469 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.466 104469 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.466 104469 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.466 104469 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.466 104469 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.467 104469 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.467 104469 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.467 104469 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.467 104469 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.467 104469 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.467 104469 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.467 104469 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.467 104469 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.467 104469 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.468 104469 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.469 104469 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.469 104469 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.469 104469 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.469 104469 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.469 104469 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.469 104469 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.469 104469 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.469 104469 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.469 104469 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.470 104469 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.470 104469 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.470 104469 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.470 104469 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.470 104469 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.470 104469 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.470 104469 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.470 104469 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.470 104469 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.471 104469 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.471 104469 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.471 104469 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.471 104469 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.471 104469 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.471 104469 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.471 104469 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.471 104469 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.472 104469 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.472 104469 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.472 104469 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.472 104469 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.472 104469 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.472 104469 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.472 104469 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.472 104469 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.472 104469 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.473 104469 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.473 104469 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.473 104469 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.473 104469 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.473 104469 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.473 104469 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.473 104469 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.473 104469 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.474 104469 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.474 104469 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.474 104469 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.474 104469 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.474 104469 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.474 104469 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.474 104469 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.475 104469 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.475 104469 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.475 104469 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.475 104469 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.475 104469 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.475 104469 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.475 104469 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.475 104469 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.475 104469 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.476 104469 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.476 104469 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.476 104469 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.476 104469 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.476 104469 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.476 104469 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.476 104469 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.477 104469 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.477 104469 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.477 104469 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.477 104469 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.477 104469 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.477 104469 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.478 104469 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.478 104469 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.478 104469 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.478 104469 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.478 104469 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.478 104469 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.478 104469 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.479 104469 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.479 104469 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.479 104469 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.479 104469 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.479 104469 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.479 104469 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.479 104469 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.480 104469 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.480 104469 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.480 104469 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.480 104469 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.480 104469 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.480 104469 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.480 104469 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.481 104469 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.481 104469 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.481 104469 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.481 104469 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.481 104469 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.481 104469 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.481 104469 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.482 104469 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.482 104469 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.482 104469 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.482 104469 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.482 104469 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.482 104469 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.482 104469 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.483 104469 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.483 104469 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.483 104469 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.483 104469 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.483 104469 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.483 104469 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.483 104469 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.484 104469 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.484 104469 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.484 104469 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.484 104469 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.484 104469 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.484 104469 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.484 104469 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.484 104469 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.485 104469 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.485 104469 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.485 104469 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.485 104469 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.485 104469 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.485 104469 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.485 104469 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.486 104469 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.486 104469 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.486 104469 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.486 104469 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.486 104469 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.486 104469 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.486 104469 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.487 104469 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.487 104469 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.487 104469 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.487 104469 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.487 104469 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.487 104469 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.488 104469 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.488 104469 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.488 104469 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.488 104469 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.488 104469 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.488 104469 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.488 104469 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.489 104469 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.489 104469 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.489 104469 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.489 104469 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.489 104469 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.489 104469 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.489 104469 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.490 104469 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.490 104469 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.490 104469 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.490 104469 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.490 104469 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.490 104469 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.490 104469 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.490 104469 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.491 104469 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.491 104469 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.491 104469 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.491 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.491 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.491 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.491 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.492 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.492 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.492 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.492 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.492 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.492 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.492 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.492 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.493 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.493 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.493 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.493 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.493 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.493 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.493 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.494 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.494 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.494 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.494 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.494 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.494 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.494 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.494 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.495 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.495 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.495 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.495 104469 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.495 104469 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.495 104469 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.495 104469 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.495 104469 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:17:58 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:17:58.496 104469 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 14:18:00 compute-0 sshd-session[104589]: Accepted publickey for zuul from 192.168.122.30 port 44458 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:18:00 compute-0 systemd-logind[807]: New session 23 of user zuul.
Nov 24 14:18:00 compute-0 systemd[1]: Started Session 23 of User zuul.
Nov 24 14:18:00 compute-0 sshd-session[104589]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:18:01 compute-0 python3.9[104742]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:18:02 compute-0 sudo[104896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbbesqcnvsjhuurxybdaqdiwhtzrdbsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993882.1973226-34-231068346137762/AnsiballZ_command.py'
Nov 24 14:18:02 compute-0 sudo[104896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:02 compute-0 python3.9[104898]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:18:02 compute-0 sudo[104896]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:03 compute-0 podman[104988]: 2025-11-24 14:18:03.471444724 +0000 UTC m=+0.080146339 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:18:03 compute-0 sudo[105090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdsaxwsoxbaprrbvjpivjvbxhsoyxrby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993883.1939354-45-173953222377255/AnsiballZ_systemd_service.py'
Nov 24 14:18:03 compute-0 sudo[105090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:04 compute-0 python3.9[105092]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:18:04 compute-0 systemd[1]: Reloading.
Nov 24 14:18:04 compute-0 systemd-rc-local-generator[105119]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:18:04 compute-0 systemd-sysv-generator[105123]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:18:04 compute-0 sudo[105090]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:05 compute-0 python3.9[105277]: ansible-ansible.builtin.service_facts Invoked
Nov 24 14:18:05 compute-0 network[105294]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 14:18:05 compute-0 network[105295]: 'network-scripts' will be removed from distribution in near future.
Nov 24 14:18:05 compute-0 network[105296]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 14:18:08 compute-0 sudo[105555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfgomqacoztbrvnztxxphurpxizxobxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993887.773689-64-117262902730866/AnsiballZ_systemd_service.py'
Nov 24 14:18:08 compute-0 sudo[105555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:08 compute-0 python3.9[105557]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:18:08 compute-0 sudo[105555]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:08 compute-0 sudo[105708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwwpahampcpkfegpnkycozfpxwychzfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993888.4314513-64-33217740736981/AnsiballZ_systemd_service.py'
Nov 24 14:18:08 compute-0 sudo[105708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:08 compute-0 python3.9[105710]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:18:09 compute-0 sudo[105708]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:09 compute-0 sudo[105861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqzrajjlipasjmwzqczmtflfwqcswakm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993889.148368-64-223354857676663/AnsiballZ_systemd_service.py'
Nov 24 14:18:09 compute-0 sudo[105861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:10 compute-0 python3.9[105863]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:18:10 compute-0 sudo[105861]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:10 compute-0 sudo[106014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zihwjrnhigjhznircqqjexbylgnbxhip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993890.3349779-64-184480299119096/AnsiballZ_systemd_service.py'
Nov 24 14:18:10 compute-0 sudo[106014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:10 compute-0 python3.9[106016]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:18:10 compute-0 sudo[106014]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:11 compute-0 sudo[106167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jualnzizxtwyophhjwjcuklwloekizgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993891.0659986-64-71886766287474/AnsiballZ_systemd_service.py'
Nov 24 14:18:11 compute-0 sudo[106167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:11 compute-0 python3.9[106169]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:18:11 compute-0 sudo[106167]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:12 compute-0 sudo[106320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytkytqvraeauzbpzhsdcsfrlvklqlzbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993891.8183286-64-247368554853033/AnsiballZ_systemd_service.py'
Nov 24 14:18:12 compute-0 sudo[106320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:12 compute-0 python3.9[106322]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:18:12 compute-0 sudo[106320]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:12 compute-0 sudo[106473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loymzjjwzcitvdhjuzhaaiphlkijvsfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993892.5344067-64-118430163857752/AnsiballZ_systemd_service.py'
Nov 24 14:18:12 compute-0 sudo[106473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:13 compute-0 python3.9[106475]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:18:13 compute-0 sudo[106473]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:13 compute-0 sudo[106626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrslggthyqfiwyzknpvsowqaxpyhvsxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993893.418475-116-101323500507257/AnsiballZ_file.py'
Nov 24 14:18:13 compute-0 sudo[106626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:14 compute-0 python3.9[106628]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:14 compute-0 sudo[106626]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:14 compute-0 sudo[106778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooibvjlwgewgkfttwffhlfpghywenprw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993894.124803-116-263523394213554/AnsiballZ_file.py'
Nov 24 14:18:14 compute-0 sudo[106778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:14 compute-0 python3.9[106780]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:14 compute-0 sudo[106778]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:14 compute-0 sudo[106930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxpnlrisottowrlsvifzmvrkozeuokag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993894.6872163-116-9180686950976/AnsiballZ_file.py'
Nov 24 14:18:14 compute-0 sudo[106930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:15 compute-0 python3.9[106932]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:15 compute-0 sudo[106930]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:15 compute-0 sudo[107082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyyyffotgwjfcpashltgnhhwzxwatoil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993895.2476435-116-36650456455757/AnsiballZ_file.py'
Nov 24 14:18:15 compute-0 sudo[107082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:15 compute-0 python3.9[107084]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:15 compute-0 sudo[107082]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:16 compute-0 sudo[107234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cncwgvjaezfiomnrjoilcwyuxufzohfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993895.8075278-116-153737558026754/AnsiballZ_file.py'
Nov 24 14:18:16 compute-0 sudo[107234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:16 compute-0 python3.9[107236]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:16 compute-0 sudo[107234]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:16 compute-0 sudo[107386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sojuvripnuofviituirqdselecwbmlpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993896.3509243-116-180293757229982/AnsiballZ_file.py'
Nov 24 14:18:16 compute-0 sudo[107386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:16 compute-0 python3.9[107388]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:16 compute-0 sudo[107386]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:17 compute-0 sudo[107538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntkshlnwlzwpblmhfqqqowlxvxyohewl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993896.9648852-116-50233147326056/AnsiballZ_file.py'
Nov 24 14:18:17 compute-0 sudo[107538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:17 compute-0 python3.9[107540]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:17 compute-0 sudo[107538]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:17 compute-0 sudo[107690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqvmbpfamaqqyvytbfydolnlyhvlsaye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993897.6130042-166-210896860240308/AnsiballZ_file.py'
Nov 24 14:18:17 compute-0 sudo[107690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:18 compute-0 python3.9[107692]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:18 compute-0 sudo[107690]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:18 compute-0 sudo[107842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsqiqzynpqhikntgjeljmpztmnqupczh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993898.3297749-166-115903874873924/AnsiballZ_file.py'
Nov 24 14:18:18 compute-0 sudo[107842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:18 compute-0 python3.9[107844]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:18 compute-0 sudo[107842]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:19 compute-0 sudo[107994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cljhnmsxxqcwotwjcuxfmrhrmwzaapaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993898.9297478-166-86852140600137/AnsiballZ_file.py'
Nov 24 14:18:19 compute-0 sudo[107994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:19 compute-0 python3.9[107996]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:19 compute-0 sudo[107994]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:19 compute-0 sudo[108146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytdqrlpolcojjhtedwulnkpmqdiscfnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993899.5039752-166-53169937775247/AnsiballZ_file.py'
Nov 24 14:18:19 compute-0 sudo[108146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:19 compute-0 python3.9[108148]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:19 compute-0 sudo[108146]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:20 compute-0 sudo[108298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzzkfhtncbvdowfxsftneizjstpcnlxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993900.0441864-166-194613019039279/AnsiballZ_file.py'
Nov 24 14:18:20 compute-0 sudo[108298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:20 compute-0 python3.9[108300]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:20 compute-0 sudo[108298]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:20 compute-0 sudo[108450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkjtwvducqdjzmetofyacgwsugezfskd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993900.6719918-166-175686532341262/AnsiballZ_file.py'
Nov 24 14:18:20 compute-0 sudo[108450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:21 compute-0 python3.9[108452]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:21 compute-0 sudo[108450]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:21 compute-0 sudo[108602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwfrdzmajzfvssmlkubehunouydjiupb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993901.2886362-166-180788544689159/AnsiballZ_file.py'
Nov 24 14:18:21 compute-0 sudo[108602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:21 compute-0 python3.9[108604]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:18:21 compute-0 sudo[108602]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:22 compute-0 sudo[108754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cinzrzgpmgluacpsjvypkbhyisneebhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993901.9116151-217-175098200796826/AnsiballZ_command.py'
Nov 24 14:18:22 compute-0 sudo[108754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:22 compute-0 python3.9[108756]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:18:22 compute-0 sudo[108754]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:23 compute-0 python3.9[108908]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 14:18:23 compute-0 sudo[109058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wynrbmjzjswxyirdryuuaungqoqrrwha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993903.2894666-235-109039109440146/AnsiballZ_systemd_service.py'
Nov 24 14:18:23 compute-0 sudo[109058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:23 compute-0 python3.9[109060]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:18:23 compute-0 systemd[1]: Reloading.
Nov 24 14:18:23 compute-0 systemd-rc-local-generator[109087]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:18:23 compute-0 systemd-sysv-generator[109092]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:18:24 compute-0 sudo[109058]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:24 compute-0 sudo[109245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnvgdelvkkusljirtdjsujdgzvxgpvoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993904.1947784-243-110370574786888/AnsiballZ_command.py'
Nov 24 14:18:24 compute-0 sudo[109245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:24 compute-0 python3.9[109247]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:18:24 compute-0 sudo[109245]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:25 compute-0 sudo[109408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbjvdabylxxjahtaukyfbudrddljjdkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993904.7736948-243-257593506078291/AnsiballZ_command.py'
Nov 24 14:18:25 compute-0 sudo[109408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:25 compute-0 podman[109372]: 2025-11-24 14:18:25.038695037 +0000 UTC m=+0.052493258 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 24 14:18:25 compute-0 python3.9[109419]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:18:25 compute-0 sudo[109408]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:25 compute-0 sudo[109570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vquhldjyysljemnzdzsntowgtyfcadqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993905.3397355-243-176156225907846/AnsiballZ_command.py'
Nov 24 14:18:25 compute-0 sudo[109570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:25 compute-0 python3.9[109572]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:18:25 compute-0 sudo[109570]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:26 compute-0 sudo[109723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khxtiuvgafahbnoeburbtxrbgcrjobai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993905.8991826-243-42222276335752/AnsiballZ_command.py'
Nov 24 14:18:26 compute-0 sudo[109723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:26 compute-0 python3.9[109725]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:18:26 compute-0 sudo[109723]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:26 compute-0 sudo[109876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkkzzkeznmlaswwcxgzexrgavmlbryer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993906.464664-243-163887632018306/AnsiballZ_command.py'
Nov 24 14:18:26 compute-0 sudo[109876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:26 compute-0 python3.9[109878]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:18:26 compute-0 sudo[109876]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:27 compute-0 sudo[110029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnwnplkcowhrltfxqrjmnzusoaurafap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993907.041471-243-14285220128557/AnsiballZ_command.py'
Nov 24 14:18:27 compute-0 sudo[110029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:27 compute-0 python3.9[110031]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:18:27 compute-0 sudo[110029]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:27 compute-0 sudo[110182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwjcggdaydhtbxcbmiaeglfrsxqgvjmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993907.6698852-243-45644320345242/AnsiballZ_command.py'
Nov 24 14:18:27 compute-0 sudo[110182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:28 compute-0 python3.9[110184]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:18:28 compute-0 sudo[110182]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:28 compute-0 sudo[110335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nscflsbhvfigiektrkkekjvuubzdyhid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993908.547879-297-137472588660760/AnsiballZ_getent.py'
Nov 24 14:18:28 compute-0 sudo[110335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:29 compute-0 python3.9[110337]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 24 14:18:29 compute-0 sudo[110335]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:29 compute-0 sudo[110488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysyalgyaodrfjiozuzwkgdhdzfumzizv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993909.3127978-305-63420198140189/AnsiballZ_group.py'
Nov 24 14:18:29 compute-0 sudo[110488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:29 compute-0 python3.9[110490]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 14:18:29 compute-0 groupadd[110491]: group added to /etc/group: name=libvirt, GID=42473
Nov 24 14:18:29 compute-0 groupadd[110491]: group added to /etc/gshadow: name=libvirt
Nov 24 14:18:30 compute-0 groupadd[110491]: new group: name=libvirt, GID=42473
Nov 24 14:18:30 compute-0 sudo[110488]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:30 compute-0 sudo[110646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tztbxawgjqehdkqdyqhzcuannnjzvsuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993910.2307162-313-25202056921355/AnsiballZ_user.py'
Nov 24 14:18:30 compute-0 sudo[110646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:30 compute-0 python3.9[110648]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 14:18:30 compute-0 useradd[110650]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 14:18:30 compute-0 sudo[110646]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:31 compute-0 sudo[110806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdoioueeomxcwrxdwtnqzrqefmivqtiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993911.2673151-324-121245576926507/AnsiballZ_setup.py'
Nov 24 14:18:31 compute-0 sudo[110806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:31 compute-0 python3.9[110808]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:18:32 compute-0 sudo[110806]: pam_unix(sudo:session): session closed for user root
Nov 24 14:18:32 compute-0 sudo[110890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsczzogcqwrklpcafafinrzecjyzvvjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763993911.2673151-324-121245576926507/AnsiballZ_dnf.py'
Nov 24 14:18:32 compute-0 sudo[110890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:18:32 compute-0 python3.9[110892]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:18:34 compute-0 podman[110896]: 2025-11-24 14:18:34.452991013 +0000 UTC m=+0.067565868 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:18:55 compute-0 podman[111107]: 2025-11-24 14:18:55.440529791 +0000 UTC m=+0.050515176 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 24 14:18:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:18:56.642 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:18:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:18:56.643 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:18:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:18:56.643 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:18:58 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 24 14:18:58 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 14:18:58 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 14:18:58 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 14:18:58 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 14:18:58 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 14:18:58 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 14:18:58 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 14:19:05 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Nov 24 14:19:05 compute-0 podman[111133]: 2025-11-24 14:19:05.479458041 +0000 UTC m=+0.083452787 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:19:08 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Nov 24 14:19:08 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 14:19:08 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 14:19:08 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 14:19:08 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 14:19:08 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 14:19:08 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 14:19:08 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 14:19:26 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 24 14:19:26 compute-0 podman[114770]: 2025-11-24 14:19:26.472595935 +0000 UTC m=+0.074062510 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 14:19:36 compute-0 podman[121603]: 2025-11-24 14:19:36.462482228 +0000 UTC m=+0.077162560 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 24 14:19:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:19:56.644 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:19:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:19:56.645 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:19:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:19:56.645 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:19:56 compute-0 podman[128008]: 2025-11-24 14:19:56.727745123 +0000 UTC m=+0.053782027 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 14:19:57 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Nov 24 14:19:57 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Nov 24 14:19:57 compute-0 kernel: SELinux:  policy capability open_perms=1
Nov 24 14:19:57 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Nov 24 14:19:57 compute-0 kernel: SELinux:  policy capability always_check_network=0
Nov 24 14:19:57 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 24 14:19:57 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 24 14:19:57 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 24 14:19:58 compute-0 groupadd[128038]: group added to /etc/group: name=dnsmasq, GID=992
Nov 24 14:19:58 compute-0 groupadd[128038]: group added to /etc/gshadow: name=dnsmasq
Nov 24 14:19:58 compute-0 groupadd[128038]: new group: name=dnsmasq, GID=992
Nov 24 14:19:58 compute-0 useradd[128045]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Nov 24 14:19:58 compute-0 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Nov 24 14:19:58 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 24 14:19:58 compute-0 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Nov 24 14:19:59 compute-0 groupadd[128058]: group added to /etc/group: name=clevis, GID=991
Nov 24 14:19:59 compute-0 groupadd[128058]: group added to /etc/gshadow: name=clevis
Nov 24 14:19:59 compute-0 groupadd[128058]: new group: name=clevis, GID=991
Nov 24 14:19:59 compute-0 useradd[128065]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Nov 24 14:19:59 compute-0 usermod[128075]: add 'clevis' to group 'tss'
Nov 24 14:19:59 compute-0 usermod[128075]: add 'clevis' to shadow group 'tss'
Nov 24 14:20:01 compute-0 polkitd[43722]: Reloading rules
Nov 24 14:20:01 compute-0 polkitd[43722]: Collecting garbage unconditionally...
Nov 24 14:20:01 compute-0 polkitd[43722]: Loading rules from directory /etc/polkit-1/rules.d
Nov 24 14:20:01 compute-0 polkitd[43722]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 24 14:20:01 compute-0 polkitd[43722]: Finished loading, compiling and executing 3 rules
Nov 24 14:20:01 compute-0 polkitd[43722]: Reloading rules
Nov 24 14:20:01 compute-0 polkitd[43722]: Collecting garbage unconditionally...
Nov 24 14:20:01 compute-0 polkitd[43722]: Loading rules from directory /etc/polkit-1/rules.d
Nov 24 14:20:01 compute-0 polkitd[43722]: Loading rules from directory /usr/share/polkit-1/rules.d
Nov 24 14:20:01 compute-0 polkitd[43722]: Finished loading, compiling and executing 3 rules
Nov 24 14:20:02 compute-0 groupadd[128263]: group added to /etc/group: name=ceph, GID=167
Nov 24 14:20:02 compute-0 groupadd[128263]: group added to /etc/gshadow: name=ceph
Nov 24 14:20:02 compute-0 groupadd[128263]: new group: name=ceph, GID=167
Nov 24 14:20:02 compute-0 useradd[128269]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Nov 24 14:20:05 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Nov 24 14:20:05 compute-0 sshd[1006]: Received signal 15; terminating.
Nov 24 14:20:05 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Nov 24 14:20:05 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Nov 24 14:20:05 compute-0 systemd[1]: sshd.service: Consumed 1.308s CPU time, read 32.0K from disk, written 8.0K to disk.
Nov 24 14:20:05 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Nov 24 14:20:05 compute-0 systemd[1]: Stopping sshd-keygen.target...
Nov 24 14:20:05 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 14:20:05 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 14:20:05 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 24 14:20:05 compute-0 systemd[1]: Reached target sshd-keygen.target.
Nov 24 14:20:05 compute-0 systemd[1]: Starting OpenSSH server daemon...
Nov 24 14:20:05 compute-0 sshd[128788]: Server listening on 0.0.0.0 port 22.
Nov 24 14:20:05 compute-0 sshd[128788]: Server listening on :: port 22.
Nov 24 14:20:05 compute-0 systemd[1]: Started OpenSSH server daemon.
Nov 24 14:20:06 compute-0 podman[128939]: 2025-11-24 14:20:06.607227761 +0000 UTC m=+0.100497570 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 24 14:20:07 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 14:20:07 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 14:20:07 compute-0 systemd[1]: Reloading.
Nov 24 14:20:07 compute-0 systemd-rc-local-generator[129072]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:07 compute-0 systemd-sysv-generator[129075]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:07 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 14:20:10 compute-0 sudo[110890]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:10 compute-0 sudo[132984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxowohcgwymiikcrdsqfjpcjeysvvrqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994010.1738636-336-93036554239890/AnsiballZ_systemd.py'
Nov 24 14:20:10 compute-0 sudo[132984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:11 compute-0 python3.9[133013]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 14:20:11 compute-0 systemd[1]: Reloading.
Nov 24 14:20:11 compute-0 systemd-rc-local-generator[133477]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:11 compute-0 systemd-sysv-generator[133482]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:11 compute-0 sudo[132984]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:11 compute-0 sudo[134270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmmtyjzuygxgmckqlzognghsmkaidpeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994011.5411332-336-54174933942103/AnsiballZ_systemd.py'
Nov 24 14:20:11 compute-0 sudo[134270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:12 compute-0 python3.9[134284]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 14:20:12 compute-0 systemd[1]: Reloading.
Nov 24 14:20:12 compute-0 systemd-rc-local-generator[134737]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:12 compute-0 systemd-sysv-generator[134740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:12 compute-0 sudo[134270]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:12 compute-0 sudo[135533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcuiktuhvquioyrlqoxnzuiifagprhoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994012.4871445-336-244566896634475/AnsiballZ_systemd.py'
Nov 24 14:20:12 compute-0 sudo[135533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:13 compute-0 python3.9[135555]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 14:20:13 compute-0 systemd[1]: Reloading.
Nov 24 14:20:13 compute-0 systemd-rc-local-generator[135994]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:13 compute-0 systemd-sysv-generator[135997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:13 compute-0 sudo[135533]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:13 compute-0 sudo[136859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzbiygdevvsmaftnkaceutzleycnxhuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994013.4686959-336-262932262335861/AnsiballZ_systemd.py'
Nov 24 14:20:13 compute-0 sudo[136859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:14 compute-0 python3.9[136878]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 14:20:14 compute-0 systemd[1]: Reloading.
Nov 24 14:20:14 compute-0 systemd-rc-local-generator[137352]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:14 compute-0 systemd-sysv-generator[137355]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:14 compute-0 sudo[136859]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:14 compute-0 sudo[138150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyvokgdxrunrzozwsvjwzpjwmldkaovx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994014.4821646-365-13623479689284/AnsiballZ_systemd.py'
Nov 24 14:20:14 compute-0 sudo[138150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:15 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 14:20:15 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 14:20:15 compute-0 systemd[1]: man-db-cache-update.service: Consumed 9.794s CPU time.
Nov 24 14:20:15 compute-0 systemd[1]: run-r645558764a8a45208356a0a37beb3c88.service: Deactivated successfully.
Nov 24 14:20:15 compute-0 python3.9[138172]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:15 compute-0 systemd[1]: Reloading.
Nov 24 14:20:15 compute-0 systemd-sysv-generator[138395]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:15 compute-0 systemd-rc-local-generator[138392]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:15 compute-0 sudo[138150]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:15 compute-0 sudo[138549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvchqrbseofxduenpghihnuihpriimed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994015.4756098-365-190745179929253/AnsiballZ_systemd.py'
Nov 24 14:20:15 compute-0 sudo[138549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:16 compute-0 python3.9[138551]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:16 compute-0 systemd[1]: Reloading.
Nov 24 14:20:16 compute-0 systemd-rc-local-generator[138582]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:16 compute-0 systemd-sysv-generator[138585]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:16 compute-0 sudo[138549]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:16 compute-0 sudo[138739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzeoaxjozuzkfoftpqomgunqckkolour ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994016.4327686-365-91182126385463/AnsiballZ_systemd.py'
Nov 24 14:20:16 compute-0 sudo[138739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:16 compute-0 python3.9[138741]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:17 compute-0 systemd[1]: Reloading.
Nov 24 14:20:17 compute-0 systemd-rc-local-generator[138769]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:17 compute-0 systemd-sysv-generator[138774]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:17 compute-0 sudo[138739]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:17 compute-0 sudo[138929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctyocbwufanulyzdvqphhlxlotdwnvkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994017.4320724-365-238218409691999/AnsiballZ_systemd.py'
Nov 24 14:20:17 compute-0 sudo[138929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:17 compute-0 python3.9[138931]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:18 compute-0 sudo[138929]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:18 compute-0 sudo[139084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykhshpfzswntcnijlwcjjuvixejaestj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994018.1883745-365-1803583183923/AnsiballZ_systemd.py'
Nov 24 14:20:18 compute-0 sudo[139084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:18 compute-0 python3.9[139086]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:18 compute-0 systemd[1]: Reloading.
Nov 24 14:20:18 compute-0 systemd-rc-local-generator[139115]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:18 compute-0 systemd-sysv-generator[139118]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:19 compute-0 sudo[139084]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:19 compute-0 sudo[139273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqgqkhiiyqxbuojedgdphgahmjlujpnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994019.2358577-401-137575124794475/AnsiballZ_systemd.py'
Nov 24 14:20:19 compute-0 sudo[139273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:19 compute-0 python3.9[139275]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 24 14:20:19 compute-0 systemd[1]: Reloading.
Nov 24 14:20:19 compute-0 systemd-rc-local-generator[139305]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:20:19 compute-0 systemd-sysv-generator[139308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:20:20 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 24 14:20:20 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 24 14:20:20 compute-0 sudo[139273]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:20 compute-0 sudo[139465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etrwoodigksyuvtfmhvbonjulnvzlimv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994020.2654078-409-10477074747805/AnsiballZ_systemd.py'
Nov 24 14:20:20 compute-0 sudo[139465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:20 compute-0 python3.9[139467]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:20 compute-0 sudo[139465]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:21 compute-0 sudo[139620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vayzhwdbpwvhplkjxspwfabthpgztafa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994021.0271275-409-268235576296490/AnsiballZ_systemd.py'
Nov 24 14:20:21 compute-0 sudo[139620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:21 compute-0 python3.9[139622]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:21 compute-0 sudo[139620]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:21 compute-0 sudo[139775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzwrngvnmyaryxfyttljllqhazfklryg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994021.7610633-409-145808376600082/AnsiballZ_systemd.py'
Nov 24 14:20:21 compute-0 sudo[139775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:22 compute-0 python3.9[139777]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:22 compute-0 sudo[139775]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:22 compute-0 sudo[139930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkhpzrmhszwchgqvttadenktyenspoya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994022.465056-409-235170181406986/AnsiballZ_systemd.py'
Nov 24 14:20:22 compute-0 sudo[139930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:23 compute-0 python3.9[139932]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:23 compute-0 sudo[139930]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:23 compute-0 sudo[140085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpwshmeioeqjclxmjcnddnuiyvulppls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994023.205204-409-54240325258611/AnsiballZ_systemd.py'
Nov 24 14:20:23 compute-0 sudo[140085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:23 compute-0 python3.9[140087]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:23 compute-0 sudo[140085]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:24 compute-0 sudo[140240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bypacuhblxktsbkirvfzjyzhpbhfmkwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994023.9501722-409-224517379958620/AnsiballZ_systemd.py'
Nov 24 14:20:24 compute-0 sudo[140240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:24 compute-0 python3.9[140242]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:24 compute-0 sudo[140240]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:25 compute-0 sudo[140395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhxbllfkpqyyjhuappvecqpwawyzcoxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994024.697782-409-7024632487216/AnsiballZ_systemd.py'
Nov 24 14:20:25 compute-0 sudo[140395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:25 compute-0 python3.9[140397]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:25 compute-0 sudo[140395]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:25 compute-0 sudo[140550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljcywwlvtbgueogmjeqpyfcvryqggeek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994025.623279-409-149823511931707/AnsiballZ_systemd.py'
Nov 24 14:20:25 compute-0 sudo[140550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:26 compute-0 python3.9[140552]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:26 compute-0 sudo[140550]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:26 compute-0 sudo[140705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oixhcdaeiyauhdhpwpczfzdraqleiehz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994026.3636448-409-141460807676548/AnsiballZ_systemd.py'
Nov 24 14:20:26 compute-0 sudo[140705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:26 compute-0 python3.9[140707]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:27 compute-0 sudo[140705]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:27 compute-0 podman[140709]: 2025-11-24 14:20:27.025661505 +0000 UTC m=+0.056261683 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true)
Nov 24 14:20:27 compute-0 sudo[140878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjllaayudymauwkafhctjopwzqhckiqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994027.1391609-409-184446941880218/AnsiballZ_systemd.py'
Nov 24 14:20:27 compute-0 sudo[140878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:27 compute-0 python3.9[140880]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:27 compute-0 sudo[140878]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:28 compute-0 sudo[141033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gounvsqntwmzkqkyeqsnykdkzlfnohuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994027.8899734-409-146576683187985/AnsiballZ_systemd.py'
Nov 24 14:20:28 compute-0 sudo[141033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:28 compute-0 python3.9[141035]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:28 compute-0 sudo[141033]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:28 compute-0 sudo[141188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjrknqqlhdqanozjxayiohznhgveless ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994028.662726-409-1075113618462/AnsiballZ_systemd.py'
Nov 24 14:20:28 compute-0 sudo[141188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:29 compute-0 python3.9[141190]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:29 compute-0 sudo[141188]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:29 compute-0 sudo[141343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaseqhzhqpmyzkdeqiinifbrfpedsntn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994029.4200184-409-279155311851857/AnsiballZ_systemd.py'
Nov 24 14:20:29 compute-0 sudo[141343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:29 compute-0 python3.9[141345]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:30 compute-0 sudo[141343]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:30 compute-0 sudo[141498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clevroretszxxcqzxyqeksfcptcesmec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994030.242372-409-26911651030036/AnsiballZ_systemd.py'
Nov 24 14:20:30 compute-0 sudo[141498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:30 compute-0 python3.9[141500]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 24 14:20:31 compute-0 sudo[141498]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:31 compute-0 sudo[141653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaugwvyjkphcmiqdwtfyxrtzbzdmahoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994031.3415205-511-257593496146583/AnsiballZ_file.py'
Nov 24 14:20:31 compute-0 sudo[141653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:31 compute-0 python3.9[141655]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:20:31 compute-0 sudo[141653]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:32 compute-0 sudo[141805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmvadgqpszqkkvsnjhgeaurgcifwseej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994031.9890769-511-53194178487980/AnsiballZ_file.py'
Nov 24 14:20:32 compute-0 sudo[141805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:32 compute-0 python3.9[141807]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:20:32 compute-0 sudo[141805]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:32 compute-0 sudo[141957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhehqwkxtttrallzljmanytpeannzecw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994032.638383-511-272783608462207/AnsiballZ_file.py'
Nov 24 14:20:32 compute-0 sudo[141957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:33 compute-0 python3.9[141959]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:20:33 compute-0 sudo[141957]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:33 compute-0 sudo[142109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwscsgturxmttoppvhdlkwjjcqzzkjot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994033.2727427-511-6896570050211/AnsiballZ_file.py'
Nov 24 14:20:33 compute-0 sudo[142109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:33 compute-0 python3.9[142111]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:20:33 compute-0 sudo[142109]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:34 compute-0 sudo[142261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udaeobwpfqwkyvnuyrnoqlflvtyhqcgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994033.839706-511-162374784691360/AnsiballZ_file.py'
Nov 24 14:20:34 compute-0 sudo[142261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:34 compute-0 python3.9[142263]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:20:34 compute-0 sudo[142261]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:34 compute-0 sudo[142413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agzumvqiuenvlhmuebsmaxwcxozilzdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994034.437967-511-208417412710891/AnsiballZ_file.py'
Nov 24 14:20:34 compute-0 sudo[142413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:34 compute-0 python3.9[142415]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:20:34 compute-0 sudo[142413]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:35 compute-0 sudo[142565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqbscbrrzrbmxzbygqwruxjhqkeknjzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994035.068505-554-176508365894818/AnsiballZ_stat.py'
Nov 24 14:20:35 compute-0 sudo[142565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:35 compute-0 python3.9[142567]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:35 compute-0 sudo[142565]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:36 compute-0 sudo[142690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqbwsqsszrmgrwafwlbbfebkyiwiclvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994035.068505-554-176508365894818/AnsiballZ_copy.py'
Nov 24 14:20:36 compute-0 sudo[142690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:36 compute-0 python3.9[142692]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763994035.068505-554-176508365894818/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:36 compute-0 sudo[142690]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:36 compute-0 sudo[142853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqzejocfymeaalqgrwuejqynkahytpdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994036.4956667-554-228637781052901/AnsiballZ_stat.py'
Nov 24 14:20:36 compute-0 sudo[142853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:36 compute-0 podman[142816]: 2025-11-24 14:20:36.769545167 +0000 UTC m=+0.072536885 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 24 14:20:36 compute-0 python3.9[142861]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:36 compute-0 sudo[142853]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:37 compute-0 sudo[142994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dboopnetzdfpbsbpohmevxgwctfdfktg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994036.4956667-554-228637781052901/AnsiballZ_copy.py'
Nov 24 14:20:37 compute-0 sudo[142994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:37 compute-0 python3.9[142996]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763994036.4956667-554-228637781052901/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:37 compute-0 sudo[142994]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:37 compute-0 sudo[143146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwktfqablunnxljmskklncvhdzwhkplp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994037.5893126-554-90042975456711/AnsiballZ_stat.py'
Nov 24 14:20:37 compute-0 sudo[143146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:38 compute-0 python3.9[143148]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:38 compute-0 sudo[143146]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:38 compute-0 sudo[143271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omxgfbqzmlfxqkpzudojhtbbmuankemr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994037.5893126-554-90042975456711/AnsiballZ_copy.py'
Nov 24 14:20:38 compute-0 sudo[143271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:38 compute-0 python3.9[143273]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763994037.5893126-554-90042975456711/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:38 compute-0 sudo[143271]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:38 compute-0 sudo[143423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qybdisttqhtkfgshwipgthhsxibgskmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994038.6959047-554-78170840104925/AnsiballZ_stat.py'
Nov 24 14:20:38 compute-0 sudo[143423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:39 compute-0 python3.9[143425]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:39 compute-0 sudo[143423]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:39 compute-0 sudo[143548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovwmqqjuscxksjcrxkdftclvzsmjziru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994038.6959047-554-78170840104925/AnsiballZ_copy.py'
Nov 24 14:20:39 compute-0 sudo[143548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:39 compute-0 python3.9[143550]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763994038.6959047-554-78170840104925/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:39 compute-0 sudo[143548]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:40 compute-0 sudo[143700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bodawgobdsafdeypulbzsyvefbxssval ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994039.8345022-554-43604901681450/AnsiballZ_stat.py'
Nov 24 14:20:40 compute-0 sudo[143700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:40 compute-0 python3.9[143702]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:40 compute-0 sudo[143700]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:40 compute-0 sudo[143825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nphnhdfjfwukwwdpzawvrxsrywnjacfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994039.8345022-554-43604901681450/AnsiballZ_copy.py'
Nov 24 14:20:40 compute-0 sudo[143825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:40 compute-0 python3.9[143827]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763994039.8345022-554-43604901681450/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:40 compute-0 sudo[143825]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:41 compute-0 sudo[143977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzqzujnkarelqhrzwotukntjdkudvzlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994041.0445163-554-62587254572225/AnsiballZ_stat.py'
Nov 24 14:20:41 compute-0 sudo[143977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:41 compute-0 python3.9[143979]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:41 compute-0 sudo[143977]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:41 compute-0 sudo[144102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyhpxtlnqboyaltjqyxfyvmspznmagxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994041.0445163-554-62587254572225/AnsiballZ_copy.py'
Nov 24 14:20:41 compute-0 sudo[144102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:42 compute-0 python3.9[144104]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763994041.0445163-554-62587254572225/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:42 compute-0 sudo[144102]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:42 compute-0 sudo[144254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbjvemegvchookziklkqxbohazjbezil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994042.3272178-554-172357202879891/AnsiballZ_stat.py'
Nov 24 14:20:42 compute-0 sudo[144254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:42 compute-0 python3.9[144256]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:42 compute-0 sudo[144254]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:43 compute-0 sudo[144377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smjogsaekjnvsjfmcstwmhusxartqbou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994042.3272178-554-172357202879891/AnsiballZ_copy.py'
Nov 24 14:20:43 compute-0 sudo[144377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:43 compute-0 python3.9[144379]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763994042.3272178-554-172357202879891/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:43 compute-0 sudo[144377]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:43 compute-0 sudo[144529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aurfqzddjxovhrdaexasizjqriazwgjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994043.429173-554-90068637475235/AnsiballZ_stat.py'
Nov 24 14:20:43 compute-0 sudo[144529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:43 compute-0 python3.9[144531]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:43 compute-0 sudo[144529]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:44 compute-0 sudo[144654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgujcbnzcvxsjcfhuiuzitbxuntmsjmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994043.429173-554-90068637475235/AnsiballZ_copy.py'
Nov 24 14:20:44 compute-0 sudo[144654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:44 compute-0 python3.9[144656]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1763994043.429173-554-90068637475235/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:44 compute-0 sudo[144654]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:44 compute-0 sudo[144806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slqvbsuxjwwzpqrbwexoqvlbzcokoytk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994044.5313072-667-93641338116453/AnsiballZ_command.py'
Nov 24 14:20:44 compute-0 sudo[144806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:44 compute-0 python3.9[144808]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 24 14:20:44 compute-0 sudo[144806]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:45 compute-0 sudo[144959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxapowsckqhclyygwboznwrzfobnxroi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994045.174023-676-112511829940458/AnsiballZ_file.py'
Nov 24 14:20:45 compute-0 sudo[144959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:45 compute-0 python3.9[144961]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:45 compute-0 sudo[144959]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:46 compute-0 sudo[145111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuffsuxypiuuarwpbdqprjwsrbhqdoqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994045.7825377-676-169677402476781/AnsiballZ_file.py'
Nov 24 14:20:46 compute-0 sudo[145111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:46 compute-0 python3.9[145113]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:46 compute-0 sudo[145111]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:46 compute-0 sudo[145263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qizszfrifzxiafzpqmejapdihqnwedty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994046.425393-676-179579262012821/AnsiballZ_file.py'
Nov 24 14:20:46 compute-0 sudo[145263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:46 compute-0 python3.9[145265]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:46 compute-0 sudo[145263]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:47 compute-0 sudo[145415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrjtujlfiwatdimkdlqwiltoxyqorxxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994047.007027-676-265153506861655/AnsiballZ_file.py'
Nov 24 14:20:47 compute-0 sudo[145415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:47 compute-0 python3.9[145417]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:47 compute-0 sudo[145415]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:47 compute-0 sudo[145567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twtqynecmudwxpbekvrkfugdzslnvlim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994047.6523108-676-181484201985497/AnsiballZ_file.py'
Nov 24 14:20:47 compute-0 sudo[145567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:48 compute-0 python3.9[145569]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:48 compute-0 sudo[145567]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:48 compute-0 sudo[145719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bveyevlebnksfqivvekmuuluxdbnanua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994048.181101-676-113842419408418/AnsiballZ_file.py'
Nov 24 14:20:48 compute-0 sudo[145719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:48 compute-0 python3.9[145721]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:48 compute-0 sudo[145719]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:49 compute-0 sudo[145871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yswcjkqphqmpucegsbueizytkpypdsqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994048.7277963-676-177746937604679/AnsiballZ_file.py'
Nov 24 14:20:49 compute-0 sudo[145871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:49 compute-0 python3.9[145873]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:49 compute-0 sudo[145871]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:49 compute-0 sudo[146023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qckqiiimzhqjvyjlmvlphbiyyggmcrvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994049.3598943-676-205976316769744/AnsiballZ_file.py'
Nov 24 14:20:49 compute-0 sudo[146023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:49 compute-0 python3.9[146025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:49 compute-0 sudo[146023]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:50 compute-0 sudo[146175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zshxifohrybnsjnaxaqjvnjakyeafvde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994049.9855435-676-45250568419084/AnsiballZ_file.py'
Nov 24 14:20:50 compute-0 sudo[146175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:50 compute-0 python3.9[146177]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:50 compute-0 sudo[146175]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:50 compute-0 sudo[146327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glqrpendvxconledpuulutfrvnaehvua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994050.6211784-676-34019362521765/AnsiballZ_file.py'
Nov 24 14:20:50 compute-0 sudo[146327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:51 compute-0 python3.9[146329]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:51 compute-0 sudo[146327]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:51 compute-0 sudo[146479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yigwpikpsqsqmmbajltzxgzizeasqavj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994051.230148-676-86577805206549/AnsiballZ_file.py'
Nov 24 14:20:51 compute-0 sudo[146479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:51 compute-0 python3.9[146481]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:51 compute-0 sudo[146479]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:52 compute-0 sudo[146631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezhcxjgwobbfbukwhmkongmlrhobrsrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994051.7749352-676-8610068465247/AnsiballZ_file.py'
Nov 24 14:20:52 compute-0 sudo[146631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:52 compute-0 python3.9[146633]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:52 compute-0 sudo[146631]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:52 compute-0 sudo[146783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqmmpinxdxqamcdgrczrzohstwiablvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994052.3619094-676-228603117393505/AnsiballZ_file.py'
Nov 24 14:20:52 compute-0 sudo[146783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:52 compute-0 python3.9[146785]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:52 compute-0 sudo[146783]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:53 compute-0 sudo[146935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgtjwsgszozwyylzjvivsmnqnpvavfhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994052.9151506-676-137242718834625/AnsiballZ_file.py'
Nov 24 14:20:53 compute-0 sudo[146935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:53 compute-0 python3.9[146937]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:53 compute-0 sudo[146935]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:53 compute-0 sudo[147087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edhsmuybzxflsekbqztscjedcflalfih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994053.579986-775-19250958855200/AnsiballZ_stat.py'
Nov 24 14:20:53 compute-0 sudo[147087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:54 compute-0 python3.9[147089]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:54 compute-0 sudo[147087]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:54 compute-0 sudo[147210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nuurkyyehgrauhojinppdqmgkvnuhzbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994053.579986-775-19250958855200/AnsiballZ_copy.py'
Nov 24 14:20:54 compute-0 sudo[147210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:54 compute-0 python3.9[147212]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994053.579986-775-19250958855200/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:54 compute-0 sudo[147210]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:54 compute-0 sudo[147362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xaxjnjyydqiglccabytsxfgevlcfwity ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994054.6739938-775-256350969644256/AnsiballZ_stat.py'
Nov 24 14:20:54 compute-0 sudo[147362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:55 compute-0 python3.9[147364]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:55 compute-0 sudo[147362]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:55 compute-0 sudo[147485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcxfogdtdmavqxmfoaflwilvhxhkmhrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994054.6739938-775-256350969644256/AnsiballZ_copy.py'
Nov 24 14:20:55 compute-0 sudo[147485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:55 compute-0 python3.9[147487]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994054.6739938-775-256350969644256/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:55 compute-0 sudo[147485]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:56 compute-0 sudo[147637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktgketzdhfsyhnvpzxvfmunikpckexxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994055.7357743-775-54293643073874/AnsiballZ_stat.py'
Nov 24 14:20:56 compute-0 sudo[147637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:56 compute-0 python3.9[147639]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:56 compute-0 sudo[147637]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:56 compute-0 sudo[147760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpcseauoiagngfogzukwwyjetojxbozz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994055.7357743-775-54293643073874/AnsiballZ_copy.py'
Nov 24 14:20:56 compute-0 sudo[147760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:20:56.646 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:20:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:20:56.646 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:20:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:20:56.647 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:20:56 compute-0 python3.9[147762]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994055.7357743-775-54293643073874/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:56 compute-0 sudo[147760]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:57 compute-0 sudo[147924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlqbjfuzhbtxzihcojamxjlqssfoohch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994056.880313-775-239746056819746/AnsiballZ_stat.py'
Nov 24 14:20:57 compute-0 sudo[147924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:57 compute-0 podman[147886]: 2025-11-24 14:20:57.169534682 +0000 UTC m=+0.048147419 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, config_id=ovn_metadata_agent)
Nov 24 14:20:57 compute-0 python3.9[147934]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:57 compute-0 sudo[147924]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:57 compute-0 sudo[148055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgzfmxjjhmajdgrwhbyeockgdtfqajwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994056.880313-775-239746056819746/AnsiballZ_copy.py'
Nov 24 14:20:57 compute-0 sudo[148055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:57 compute-0 python3.9[148057]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994056.880313-775-239746056819746/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:57 compute-0 sudo[148055]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:58 compute-0 sudo[148207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cndcdyhmadvmgpqaqxxldeqavhwiwfch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994058.012794-775-61240522947731/AnsiballZ_stat.py'
Nov 24 14:20:58 compute-0 sudo[148207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:58 compute-0 python3.9[148209]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:20:58 compute-0 sudo[148207]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:59 compute-0 sudo[148330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faqxrbjhqsbdtfkphqfbtgexcrijjrgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994058.012794-775-61240522947731/AnsiballZ_copy.py'
Nov 24 14:20:59 compute-0 sudo[148330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:20:59 compute-0 python3.9[148332]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994058.012794-775-61240522947731/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:20:59 compute-0 sudo[148330]: pam_unix(sudo:session): session closed for user root
Nov 24 14:20:59 compute-0 sudo[148482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqhrywtjvmdkyexftjnpylaumqimrwem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994059.4717593-775-247106244496579/AnsiballZ_stat.py'
Nov 24 14:20:59 compute-0 sudo[148482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:00 compute-0 python3.9[148484]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:00 compute-0 sudo[148482]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:00 compute-0 sudo[148605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqolskkhiumkpbpbzintxgsrxspzfljr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994059.4717593-775-247106244496579/AnsiballZ_copy.py'
Nov 24 14:21:00 compute-0 sudo[148605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:00 compute-0 python3.9[148607]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994059.4717593-775-247106244496579/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:00 compute-0 sudo[148605]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:01 compute-0 sudo[148757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfnliulckgiaskjacxobhdmvwdifslrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994060.8063502-775-157592704011553/AnsiballZ_stat.py'
Nov 24 14:21:01 compute-0 sudo[148757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:01 compute-0 python3.9[148759]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:01 compute-0 sudo[148757]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:01 compute-0 sudo[148880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qovgqmrvrtzarlvgqdmminrtvocyeays ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994060.8063502-775-157592704011553/AnsiballZ_copy.py'
Nov 24 14:21:01 compute-0 sudo[148880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:01 compute-0 python3.9[148882]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994060.8063502-775-157592704011553/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:01 compute-0 sudo[148880]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:02 compute-0 sudo[149032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmjhdidltejximivorfobmgnrgkrjxuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994062.1146822-775-259397635961020/AnsiballZ_stat.py'
Nov 24 14:21:02 compute-0 sudo[149032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:02 compute-0 python3.9[149034]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:02 compute-0 sudo[149032]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:02 compute-0 sudo[149155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcwrkqwzgtucfctcjpchvlzplqwlzywq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994062.1146822-775-259397635961020/AnsiballZ_copy.py'
Nov 24 14:21:02 compute-0 sudo[149155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:03 compute-0 python3.9[149157]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994062.1146822-775-259397635961020/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:03 compute-0 sudo[149155]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:03 compute-0 sudo[149307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmntffuiwadpnlylglkwivpdzyyqiubx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994063.3051457-775-66177943552756/AnsiballZ_stat.py'
Nov 24 14:21:03 compute-0 sudo[149307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:03 compute-0 python3.9[149309]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:03 compute-0 sudo[149307]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:04 compute-0 sudo[149430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrmcvbvyamzjqbtyplymcjuefbtlsagv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994063.3051457-775-66177943552756/AnsiballZ_copy.py'
Nov 24 14:21:04 compute-0 sudo[149430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:04 compute-0 python3.9[149432]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994063.3051457-775-66177943552756/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:04 compute-0 sudo[149430]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:04 compute-0 sudo[149582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hriswmmkptnonzdhdpfrcpzltqpmjmqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994064.612752-775-173028514048861/AnsiballZ_stat.py'
Nov 24 14:21:04 compute-0 sudo[149582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:05 compute-0 python3.9[149584]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:05 compute-0 sudo[149582]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:05 compute-0 sudo[149705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txfpptetzuqtvjnzhfiporfydjqwaxhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994064.612752-775-173028514048861/AnsiballZ_copy.py'
Nov 24 14:21:05 compute-0 sudo[149705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:05 compute-0 python3.9[149707]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994064.612752-775-173028514048861/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:05 compute-0 sudo[149705]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:06 compute-0 sudo[149857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hecptvclzanobxoxlzbossmdnazenqmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994065.737434-775-204440521804204/AnsiballZ_stat.py'
Nov 24 14:21:06 compute-0 sudo[149857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:06 compute-0 python3.9[149859]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:06 compute-0 sudo[149857]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:06 compute-0 sudo[149980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmcwrucbvnybrbwnuykuzyyrelzfwomh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994065.737434-775-204440521804204/AnsiballZ_copy.py'
Nov 24 14:21:06 compute-0 sudo[149980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:06 compute-0 python3.9[149982]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994065.737434-775-204440521804204/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:06 compute-0 sudo[149980]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:07 compute-0 sudo[150152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhmkobedxxvwzuflquotuehnvqtuexzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994067.057408-775-200951252092772/AnsiballZ_stat.py'
Nov 24 14:21:07 compute-0 sudo[150152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:07 compute-0 podman[150106]: 2025-11-24 14:21:07.430762132 +0000 UTC m=+0.100020050 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118)
Nov 24 14:21:07 compute-0 python3.9[150157]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:07 compute-0 sudo[150152]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:07 compute-0 sudo[150281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cagqrogoscfmwxesmtamkibnofksgymp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994067.057408-775-200951252092772/AnsiballZ_copy.py'
Nov 24 14:21:07 compute-0 sudo[150281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:08 compute-0 python3.9[150283]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994067.057408-775-200951252092772/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:08 compute-0 sudo[150281]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:08 compute-0 sudo[150433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tturscfpwehllltocakznmewnfxpsuks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994068.3387096-775-265213984089124/AnsiballZ_stat.py'
Nov 24 14:21:08 compute-0 sudo[150433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:08 compute-0 python3.9[150435]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:08 compute-0 sudo[150433]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:09 compute-0 sudo[150556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdyfkqsohnrhahslmindrbyrqvleredz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994068.3387096-775-265213984089124/AnsiballZ_copy.py'
Nov 24 14:21:09 compute-0 sudo[150556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:10 compute-0 python3.9[150558]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994068.3387096-775-265213984089124/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:10 compute-0 sudo[150556]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:11 compute-0 sudo[150708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zopyrmeknkzndwloucoixmfsmwwcdhwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994070.7477257-775-117652147193942/AnsiballZ_stat.py'
Nov 24 14:21:11 compute-0 sudo[150708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:11 compute-0 python3.9[150710]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:11 compute-0 sudo[150708]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:11 compute-0 sudo[150831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsjdjwhgvixpqiobazethiefaorezpej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994070.7477257-775-117652147193942/AnsiballZ_copy.py'
Nov 24 14:21:11 compute-0 sudo[150831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:11 compute-0 python3.9[150833]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994070.7477257-775-117652147193942/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:11 compute-0 sudo[150831]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:12 compute-0 python3.9[150983]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:21:13 compute-0 sudo[151136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxbypvzqprhyapyrdzjiciyhabxipakp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994072.6731327-981-265967497094328/AnsiballZ_seboolean.py'
Nov 24 14:21:13 compute-0 sudo[151136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:13 compute-0 python3.9[151138]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 24 14:21:14 compute-0 sudo[151136]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:14 compute-0 sudo[151292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ireniewhikfrjcpkmaeqnoepwcwllbab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994074.6386228-989-152578035321047/AnsiballZ_copy.py'
Nov 24 14:21:14 compute-0 dbus-broker-launch[773]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 24 14:21:14 compute-0 sudo[151292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:15 compute-0 python3.9[151294]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:15 compute-0 sudo[151292]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:15 compute-0 sudo[151444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdhyenqhiqtgqqxhrqhrrwhfciizjdqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994075.2975554-989-197843311229881/AnsiballZ_copy.py'
Nov 24 14:21:15 compute-0 sudo[151444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:15 compute-0 python3.9[151446]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:15 compute-0 sudo[151444]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:16 compute-0 sudo[151596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myehuvdnzqgbsruzugovjjxuqrsurcku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994075.9453058-989-268867890835882/AnsiballZ_copy.py'
Nov 24 14:21:16 compute-0 sudo[151596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:16 compute-0 python3.9[151598]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:16 compute-0 sudo[151596]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:16 compute-0 sudo[151748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmcnvoxveuwnpxngxdgugxyrtdnochlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994076.5360858-989-105767232801338/AnsiballZ_copy.py'
Nov 24 14:21:16 compute-0 sudo[151748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:16 compute-0 python3.9[151750]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:16 compute-0 sudo[151748]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:17 compute-0 sudo[151900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwtuwmpmgfqneuatvxudwcjyewezofld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994077.0995033-989-172485988072167/AnsiballZ_copy.py'
Nov 24 14:21:17 compute-0 sudo[151900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:17 compute-0 python3.9[151902]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:17 compute-0 sudo[151900]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:17 compute-0 sudo[152052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mopykuhrzsdgbhwjnitfcvzygjekrvxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994077.6989567-1025-222551514518515/AnsiballZ_copy.py'
Nov 24 14:21:17 compute-0 sudo[152052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:18 compute-0 python3.9[152054]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:18 compute-0 sudo[152052]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:18 compute-0 sudo[152204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfjftmthenulfwkqbgqjgmhedwzctvds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994078.2509296-1025-6793238565991/AnsiballZ_copy.py'
Nov 24 14:21:18 compute-0 sudo[152204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:18 compute-0 python3.9[152206]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:18 compute-0 sudo[152204]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:19 compute-0 sudo[152356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbemntwllqginmjehqkhegrcebvgcicp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994078.8851888-1025-159532541694371/AnsiballZ_copy.py'
Nov 24 14:21:19 compute-0 sudo[152356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:19 compute-0 python3.9[152358]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:19 compute-0 sudo[152356]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:19 compute-0 sudo[152508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjxquwbouuqnijmtpizualuanrlysrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994079.4510796-1025-26521026722714/AnsiballZ_copy.py'
Nov 24 14:21:19 compute-0 sudo[152508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:19 compute-0 python3.9[152510]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:19 compute-0 sudo[152508]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:20 compute-0 sudo[152660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzmbmjjtyvytjnebxsqfsvijwcrnnwzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994080.0172918-1025-241650930141621/AnsiballZ_copy.py'
Nov 24 14:21:20 compute-0 sudo[152660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:20 compute-0 python3.9[152662]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:20 compute-0 sudo[152660]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:20 compute-0 sudo[152812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bucjjdahshzqvkfzafwjonpejewpkocb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994080.6150124-1061-277957993808150/AnsiballZ_systemd.py'
Nov 24 14:21:20 compute-0 sudo[152812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:21 compute-0 python3.9[152814]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:21:21 compute-0 systemd[1]: Reloading.
Nov 24 14:21:21 compute-0 systemd-rc-local-generator[152840]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:21:21 compute-0 systemd-sysv-generator[152844]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:21:21 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Nov 24 14:21:21 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Nov 24 14:21:21 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 24 14:21:21 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 24 14:21:21 compute-0 systemd[1]: Starting libvirt logging daemon...
Nov 24 14:21:21 compute-0 systemd[1]: Started libvirt logging daemon.
Nov 24 14:21:21 compute-0 sudo[152812]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:21 compute-0 sudo[153005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wajfgbsxufilqpokiljvktyjwujuirpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994081.6831458-1061-266255613688791/AnsiballZ_systemd.py'
Nov 24 14:21:21 compute-0 sudo[153005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:22 compute-0 python3.9[153007]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:21:22 compute-0 systemd[1]: Reloading.
Nov 24 14:21:22 compute-0 systemd-sysv-generator[153037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:21:22 compute-0 systemd-rc-local-generator[153033]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:21:22 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 24 14:21:22 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 24 14:21:22 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 24 14:21:22 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 24 14:21:22 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 24 14:21:22 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 24 14:21:22 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 24 14:21:22 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 24 14:21:22 compute-0 sudo[153005]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:22 compute-0 sudo[153221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxwqqacgjvucchweabpwkaszisupoifb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994082.6989152-1061-238181098608218/AnsiballZ_systemd.py'
Nov 24 14:21:22 compute-0 sudo[153221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:23 compute-0 python3.9[153223]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:21:23 compute-0 systemd[1]: Reloading.
Nov 24 14:21:23 compute-0 systemd-rc-local-generator[153250]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:21:23 compute-0 systemd-sysv-generator[153254]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:21:23 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 24 14:21:23 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 24 14:21:23 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 24 14:21:23 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 24 14:21:23 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 24 14:21:23 compute-0 systemd[1]: Starting libvirt proxy daemon...
Nov 24 14:21:23 compute-0 systemd[1]: Started libvirt proxy daemon.
Nov 24 14:21:23 compute-0 sudo[153221]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:23 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 24 14:21:23 compute-0 sudo[153432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpqkoyhxzalvkfyjtqttdfgkpbwvwbsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994083.7151117-1061-223711387414158/AnsiballZ_systemd.py'
Nov 24 14:21:23 compute-0 sudo[153432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:23 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 24 14:21:24 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 24 14:21:24 compute-0 python3.9[153436]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:21:24 compute-0 systemd[1]: Reloading.
Nov 24 14:21:24 compute-0 systemd-rc-local-generator[153470]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:21:24 compute-0 systemd-sysv-generator[153475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:21:24 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Nov 24 14:21:24 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 24 14:21:24 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 24 14:21:24 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 24 14:21:24 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 24 14:21:24 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 24 14:21:24 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 24 14:21:24 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 24 14:21:24 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 24 14:21:24 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 24 14:21:24 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 24 14:21:24 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 24 14:21:24 compute-0 sudo[153432]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:24 compute-0 setroubleshoot[153259]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 106dd690-a4d8-4b0c-b902-8e567a272cd3
Nov 24 14:21:24 compute-0 setroubleshoot[153259]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
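The setroubleshoot advice above points at `ausearch -m avc` output. Denials of this kind arrive as raw AVC records; a minimal sketch of pulling out the fields setroubleshoot keys on (the sample record is hypothetical — its timestamp, serial, and PID are made up, though the field layout follows the standard audit record format for a `virtlogd` capability denial):

```python
import re

# Hypothetical raw AVC record of the kind `ausearch -m avc -ts recent`
# would return for the virtlogd denial logged above.
SAMPLE = ('type=AVC msg=audit(1763993884.123:456): avc:  denied  '
          '{ dac_read_search } for  pid=153200 comm="virtlogd" '
          'capability=2  scontext=system_u:system_r:virtlogd_t:s0 '
          'tcontext=system_u:system_r:virtlogd_t:s0 tclass=capability permissive=0')

def parse_avc(record: str) -> dict:
    """Extract the denied permissions, source command, and target class."""
    perms = re.search(r'denied\s+\{ ([^}]+) \}', record)
    fields = dict(re.findall(r'(\w+)=("[^"]*"|\S+)', record))
    return {
        'denied': perms.group(1).split() if perms else [],
        'comm': fields.get('comm', '').strip('"'),
        'tclass': fields.get('tclass'),
    }

print(parse_avc(SAMPLE))
```

If the real record carries a PATH line, the advice above applies (fix ownership/permissions); otherwise the `audit2allow`/`semodule` route generates a local policy module.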
Nov 24 14:21:24 compute-0 sudo[153657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjpgpwjwxadctivvkaxynezlkwenbjmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994084.7299895-1061-192966925674385/AnsiballZ_systemd.py'
Nov 24 14:21:24 compute-0 sudo[153657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:25 compute-0 python3.9[153659]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:21:25 compute-0 systemd[1]: Reloading.
Nov 24 14:21:25 compute-0 systemd-rc-local-generator[153686]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:21:25 compute-0 systemd-sysv-generator[153689]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:21:25 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Nov 24 14:21:25 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Nov 24 14:21:25 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 24 14:21:25 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 24 14:21:25 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 24 14:21:25 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 24 14:21:25 compute-0 systemd[1]: Starting libvirt secret daemon...
Nov 24 14:21:25 compute-0 systemd[1]: Started libvirt secret daemon.
Nov 24 14:21:25 compute-0 sudo[153657]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:26 compute-0 sudo[153869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaqapomyaxjmnbkqxtoekmiyywjyuajk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994085.9101396-1098-47520014048354/AnsiballZ_file.py'
Nov 24 14:21:26 compute-0 sudo[153869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:26 compute-0 python3.9[153871]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:26 compute-0 sudo[153869]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:26 compute-0 sudo[154021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqufdxrjukzrhtxjaizigkleiswwkxyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994086.5205166-1106-221567182655193/AnsiballZ_find.py'
Nov 24 14:21:26 compute-0 sudo[154021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:26 compute-0 python3.9[154023]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 14:21:26 compute-0 sudo[154021]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:27 compute-0 podman[154071]: 2025-11-24 14:21:27.476934305 +0000 UTC m=+0.079676201 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 14:21:27 compute-0 sudo[154192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnjwkxrlaclaonrcdeyuxdnwyhjcrkex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994087.37445-1120-89953788227910/AnsiballZ_stat.py'
Nov 24 14:21:27 compute-0 sudo[154192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:27 compute-0 python3.9[154194]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:27 compute-0 sudo[154192]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:28 compute-0 sudo[154315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytxwscstyaontvmtdpgtfmixxxbsnjhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994087.37445-1120-89953788227910/AnsiballZ_copy.py'
Nov 24 14:21:28 compute-0 sudo[154315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:28 compute-0 python3.9[154317]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994087.37445-1120-89953788227910/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:28 compute-0 sudo[154315]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:29 compute-0 sudo[154467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhkdczdomktibgzmqlawilqhgxhrhvys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994088.7841747-1136-4198165323948/AnsiballZ_file.py'
Nov 24 14:21:29 compute-0 sudo[154467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:29 compute-0 python3.9[154469]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:29 compute-0 sudo[154467]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:29 compute-0 sudo[154619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjijmcrkizphwrewuslbmxrvahdpkfvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994089.4051404-1144-43976331648085/AnsiballZ_stat.py'
Nov 24 14:21:29 compute-0 sudo[154619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:29 compute-0 python3.9[154621]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:29 compute-0 sudo[154619]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:30 compute-0 sudo[154697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibckbokazrswmsmxtqernqrfduccknnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994089.4051404-1144-43976331648085/AnsiballZ_file.py'
Nov 24 14:21:30 compute-0 sudo[154697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:30 compute-0 python3.9[154699]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:30 compute-0 sudo[154697]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:30 compute-0 sudo[154849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnwsnjuplknlsxwoqljjaypyilxbhony ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994090.4681082-1156-185508062753827/AnsiballZ_stat.py'
Nov 24 14:21:30 compute-0 sudo[154849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:30 compute-0 python3.9[154851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:30 compute-0 sudo[154849]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:31 compute-0 sudo[154927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iajgbfzxwtvazpqobudaafuovijmylut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994090.4681082-1156-185508062753827/AnsiballZ_file.py'
Nov 24 14:21:31 compute-0 sudo[154927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:31 compute-0 python3.9[154929]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.cpnl3u81 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:31 compute-0 sudo[154927]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:31 compute-0 sudo[155079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrstzgfewckeukbkmqzaezbtinzywtbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994091.5349686-1168-31213495976129/AnsiballZ_stat.py'
Nov 24 14:21:31 compute-0 sudo[155079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:32 compute-0 python3.9[155081]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:32 compute-0 sudo[155079]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:32 compute-0 sudo[155157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnsctsxameosxditaiviacdczmvtpoci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994091.5349686-1168-31213495976129/AnsiballZ_file.py'
Nov 24 14:21:32 compute-0 sudo[155157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:32 compute-0 python3.9[155159]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:32 compute-0 sudo[155157]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:32 compute-0 sudo[155309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mumcjzvhhqujeipwprabvifluzggpwry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994092.6455295-1181-130222681035536/AnsiballZ_command.py'
Nov 24 14:21:32 compute-0 sudo[155309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:33 compute-0 python3.9[155311]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:21:33 compute-0 sudo[155309]: pam_unix(sudo:session): session closed for user root
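The task above captures the live ruleset as JSON (`nft -j list ruleset`), which the role can then diff against the desired EDPM state. `nft -j` emits a top-level `"nftables"` array whose elements each wrap one object type (`metainfo`, `table`, `chain`, `rule`, ...). A minimal sketch of indexing that output — the sample document below is hypothetical and abridged, not the ruleset actually returned on this host:

```python
import json

# Abridged, hypothetical stand-in for `nft -j list ruleset` output.
SAMPLE = json.dumps({
    "nftables": [
        {"metainfo": {"version": "1.0.4", "json_schema_version": 1}},
        {"table": {"family": "inet", "name": "filter", "handle": 1}},
        {"chain": {"family": "inet", "table": "filter",
                   "name": "EDPM_INPUT", "handle": 2}},
        {"rule": {"family": "inet", "table": "filter", "chain": "EDPM_INPUT",
                  "handle": 3, "expr": [{"accept": None}]}},
    ]
})

def chains_by_table(doc: str) -> dict:
    """Index chain names by (family, table) for a desired-vs-live comparison."""
    out = {}
    for item in json.loads(doc)["nftables"]:
        if "chain" in item:
            c = item["chain"]
            out.setdefault((c["family"], c["table"]), []).append(c["name"])
    return out

print(chains_by_table(SAMPLE))
```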
Nov 24 14:21:33 compute-0 sudo[155462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flkcqwgjfbkfaudwpqfpwwdyihltphga ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763994093.3289657-1189-248234412777910/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 14:21:33 compute-0 sudo[155462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:33 compute-0 python3[155464]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 14:21:33 compute-0 sudo[155462]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:34 compute-0 sudo[155614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jycrjbnecnlcjqluylhjvltmxzekwtab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994094.1324756-1197-238023203760031/AnsiballZ_stat.py'
Nov 24 14:21:34 compute-0 sudo[155614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:34 compute-0 python3.9[155616]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:34 compute-0 sudo[155614]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:34 compute-0 sudo[155692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fovrhghitjpimrudoeeiivhvafgbymxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994094.1324756-1197-238023203760031/AnsiballZ_file.py'
Nov 24 14:21:34 compute-0 sudo[155692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:34 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 24 14:21:35 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 24 14:21:35 compute-0 python3.9[155694]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:35 compute-0 sudo[155692]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:35 compute-0 sudo[155844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eanxjgzifpqndbmgjjvayelrtgsxitxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994095.1929965-1209-273144672358453/AnsiballZ_stat.py'
Nov 24 14:21:35 compute-0 sudo[155844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:35 compute-0 python3.9[155846]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:35 compute-0 sudo[155844]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:35 compute-0 sudo[155922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufayatmcfuyznhvghbrlyaexkmuavlkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994095.1929965-1209-273144672358453/AnsiballZ_file.py'
Nov 24 14:21:35 compute-0 sudo[155922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:36 compute-0 python3.9[155924]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:36 compute-0 sudo[155922]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:36 compute-0 sudo[156074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brpofhycwvwkyodoregaxpzvcopdzgkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994096.2062063-1221-18958055895738/AnsiballZ_stat.py'
Nov 24 14:21:36 compute-0 sudo[156074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:36 compute-0 python3.9[156076]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:36 compute-0 sudo[156074]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:36 compute-0 sudo[156152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snjzuppeobuhbprnuwmllpthewxwvghn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994096.2062063-1221-18958055895738/AnsiballZ_file.py'
Nov 24 14:21:36 compute-0 sudo[156152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:37 compute-0 python3.9[156154]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:37 compute-0 sudo[156152]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:37 compute-0 sudo[156324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouxofaxbnzsvjdlmpyflupoxswxqxfhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994097.3441284-1233-16020245175905/AnsiballZ_stat.py'
Nov 24 14:21:37 compute-0 sudo[156324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:37 compute-0 podman[156279]: 2025-11-24 14:21:37.688751512 +0000 UTC m=+0.086626313 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 14:21:37 compute-0 python3.9[156330]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:37 compute-0 sudo[156324]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:38 compute-0 sudo[156410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjsseijwjmrzopqhktjynzuudoxjfvzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994097.3441284-1233-16020245175905/AnsiballZ_file.py'
Nov 24 14:21:38 compute-0 sudo[156410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:38 compute-0 python3.9[156412]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:38 compute-0 sudo[156410]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:38 compute-0 sudo[156562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dedhhbfqgvzhpyihqmjckrdorgdsosyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994098.5047097-1245-134174613165132/AnsiballZ_stat.py'
Nov 24 14:21:38 compute-0 sudo[156562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:39 compute-0 python3.9[156564]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:39 compute-0 sudo[156562]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:39 compute-0 sudo[156687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocymtyqgvzudvolkbpfpmexcwgmdftno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994098.5047097-1245-134174613165132/AnsiballZ_copy.py'
Nov 24 14:21:39 compute-0 sudo[156687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:39 compute-0 python3.9[156689]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994098.5047097-1245-134174613165132/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:39 compute-0 sudo[156687]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:40 compute-0 sudo[156839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqnyuxahunsmkqfobrxwxxgmviaspgrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994099.7622364-1260-26508941352658/AnsiballZ_file.py'
Nov 24 14:21:40 compute-0 sudo[156839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:40 compute-0 python3.9[156841]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:40 compute-0 sudo[156839]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:40 compute-0 sudo[156991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otmrfkvmyoqoibamimvxdnrizglulhtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994100.3811295-1268-24079724517518/AnsiballZ_command.py'
Nov 24 14:21:40 compute-0 sudo[156991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:40 compute-0 python3.9[156993]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:21:40 compute-0 sudo[156991]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:41 compute-0 sudo[157146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niwdolkndyktddaldocvlyniheaaklbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994101.1014884-1276-89135958467748/AnsiballZ_blockinfile.py'
Nov 24 14:21:41 compute-0 sudo[157146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:41 compute-0 python3.9[157148]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:41 compute-0 sudo[157146]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:42 compute-0 sudo[157298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pazmgyfndwfdqwmsckuewnjxhodfpyck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994101.9111612-1285-245271376378768/AnsiballZ_command.py'
Nov 24 14:21:42 compute-0 sudo[157298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:42 compute-0 python3.9[157300]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:21:42 compute-0 sudo[157298]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:42 compute-0 sudo[157451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evkaiojjtcnzmlcvuctkugnyaifphhte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994102.5518885-1293-256818336928589/AnsiballZ_stat.py'
Nov 24 14:21:42 compute-0 sudo[157451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:43 compute-0 python3.9[157453]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:21:43 compute-0 sudo[157451]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:43 compute-0 sudo[157605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jswooptwmjsmbwdxurmwjgsnfqozvqed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994103.2135472-1301-153199273316757/AnsiballZ_command.py'
Nov 24 14:21:43 compute-0 sudo[157605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:43 compute-0 python3.9[157607]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:21:43 compute-0 sudo[157605]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:44 compute-0 sudo[157760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uijclzsigwvykapkhehvaspunblxnrbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994103.844817-1309-149145649593878/AnsiballZ_file.py'
Nov 24 14:21:44 compute-0 sudo[157760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:44 compute-0 python3.9[157762]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:44 compute-0 sudo[157760]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:44 compute-0 sudo[157912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgwsxraofufksfgfmfcthlxzejqgpkep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994104.4390461-1317-153878699321239/AnsiballZ_stat.py'
Nov 24 14:21:44 compute-0 sudo[157912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:44 compute-0 python3.9[157914]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:44 compute-0 sudo[157912]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:45 compute-0 sudo[158035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlkztpbpzdjpwhmmqsylytbruqzpoctj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994104.4390461-1317-153878699321239/AnsiballZ_copy.py'
Nov 24 14:21:45 compute-0 sudo[158035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:45 compute-0 python3.9[158037]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994104.4390461-1317-153878699321239/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:45 compute-0 sudo[158035]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:45 compute-0 sudo[158187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apgcerayfbxthdldqoahssvgeourpajk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994105.5400565-1332-184368199846208/AnsiballZ_stat.py'
Nov 24 14:21:45 compute-0 sudo[158187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:46 compute-0 python3.9[158189]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:46 compute-0 sudo[158187]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:46 compute-0 sudo[158310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btnbunlwvddsfkololkkkpwifaoirarq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994105.5400565-1332-184368199846208/AnsiballZ_copy.py'
Nov 24 14:21:46 compute-0 sudo[158310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:46 compute-0 python3.9[158312]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994105.5400565-1332-184368199846208/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:46 compute-0 sudo[158310]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:47 compute-0 sudo[158462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqndfjvsgncvhfvjdxrpkzwqyzzqxnji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994106.7962086-1347-232144437625045/AnsiballZ_stat.py'
Nov 24 14:21:47 compute-0 sudo[158462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:47 compute-0 python3.9[158464]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:21:47 compute-0 sudo[158462]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:47 compute-0 sudo[158585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjxtbwcuwmqacjrhdukznxijcadfcgbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994106.7962086-1347-232144437625045/AnsiballZ_copy.py'
Nov 24 14:21:47 compute-0 sudo[158585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:47 compute-0 python3.9[158587]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994106.7962086-1347-232144437625045/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:21:47 compute-0 sudo[158585]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:48 compute-0 sudo[158737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftpuxsrkuclmhmhfxthjhersrdxkwvka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994108.1742065-1362-262498804303382/AnsiballZ_systemd.py'
Nov 24 14:21:48 compute-0 sudo[158737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:48 compute-0 python3.9[158739]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:21:48 compute-0 systemd[1]: Reloading.
Nov 24 14:21:48 compute-0 systemd-rc-local-generator[158762]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:21:48 compute-0 systemd-sysv-generator[158767]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:21:49 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Nov 24 14:21:49 compute-0 sudo[158737]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:49 compute-0 sudo[158927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecdbfzifwuapvpfeqfqzrewbgwplvnpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994109.2631264-1370-5282613638642/AnsiballZ_systemd.py'
Nov 24 14:21:49 compute-0 sudo[158927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:21:49 compute-0 python3.9[158929]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 24 14:21:49 compute-0 systemd[1]: Reloading.
Nov 24 14:21:49 compute-0 systemd-rc-local-generator[158956]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:21:49 compute-0 systemd-sysv-generator[158960]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:21:50 compute-0 systemd[1]: Reloading.
Nov 24 14:21:50 compute-0 systemd-sysv-generator[158999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:21:50 compute-0 systemd-rc-local-generator[158996]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:21:50 compute-0 sudo[158927]: pam_unix(sudo:session): session closed for user root
Nov 24 14:21:50 compute-0 sshd-session[104592]: Connection closed by 192.168.122.30 port 44458
Nov 24 14:21:50 compute-0 sshd-session[104589]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:21:50 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Nov 24 14:21:50 compute-0 systemd[1]: session-23.scope: Consumed 3min 9.570s CPU time.
Nov 24 14:21:50 compute-0 systemd-logind[807]: Session 23 logged out. Waiting for processes to exit.
Nov 24 14:21:50 compute-0 systemd-logind[807]: Removed session 23.
Nov 24 14:21:56 compute-0 sshd-session[159028]: Accepted publickey for zuul from 192.168.122.30 port 40430 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:21:56 compute-0 systemd-logind[807]: New session 24 of user zuul.
Nov 24 14:21:56 compute-0 systemd[1]: Started Session 24 of User zuul.
Nov 24 14:21:56 compute-0 sshd-session[159028]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:21:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:21:56.646 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:21:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:21:56.648 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:21:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:21:56.648 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:21:57 compute-0 python3.9[159181]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:21:58 compute-0 podman[159309]: 2025-11-24 14:21:58.391838156 +0000 UTC m=+0.066084721 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Nov 24 14:21:58 compute-0 python3.9[159344]: ansible-ansible.builtin.service_facts Invoked
Nov 24 14:21:58 compute-0 network[159372]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 14:21:58 compute-0 network[159373]: 'network-scripts' will be removed from distribution in near future.
Nov 24 14:21:58 compute-0 network[159374]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 14:22:02 compute-0 sudo[159643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqpntaednyczecucheojxkqcpgxvpklg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994121.9729242-47-14751793527353/AnsiballZ_setup.py'
Nov 24 14:22:02 compute-0 sudo[159643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:02 compute-0 python3.9[159645]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 24 14:22:02 compute-0 sudo[159643]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:03 compute-0 sudo[159727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkxytimunucunwquqkasnbtmciqxpjzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994121.9729242-47-14751793527353/AnsiballZ_dnf.py'
Nov 24 14:22:03 compute-0 sudo[159727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:03 compute-0 python3.9[159729]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:22:08 compute-0 podman[159731]: 2025-11-24 14:22:08.47976766 +0000 UTC m=+0.083902521 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:22:08 compute-0 sudo[159727]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:09 compute-0 sudo[159905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xefurysxzrmocpvkzjnbbtawtdczeiuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994128.6306548-59-198185767760033/AnsiballZ_stat.py'
Nov 24 14:22:09 compute-0 sudo[159905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:09 compute-0 python3.9[159907]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:22:09 compute-0 sudo[159905]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:09 compute-0 sudo[160057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-poknhoniwgerigjzriwcqghkfptgnwqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994129.4092903-69-205227197660375/AnsiballZ_command.py'
Nov 24 14:22:09 compute-0 sudo[160057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:09 compute-0 python3.9[160059]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:22:10 compute-0 sudo[160057]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:10 compute-0 sudo[160210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stqazcfwkcapsdmlyphytfmcanlmxmkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994130.2570794-79-100540489463069/AnsiballZ_stat.py'
Nov 24 14:22:10 compute-0 sudo[160210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:10 compute-0 python3.9[160212]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:22:10 compute-0 sudo[160210]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:11 compute-0 sudo[160362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xteasbajcnyeciolmodftgicmovqbkfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994130.8671358-87-243969850680375/AnsiballZ_command.py'
Nov 24 14:22:11 compute-0 sudo[160362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:11 compute-0 python3.9[160364]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:22:11 compute-0 sudo[160362]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:11 compute-0 sudo[160515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-virsxftefwsmnvajyxlgtlnjovuqssto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994131.6873016-95-133691142263452/AnsiballZ_stat.py'
Nov 24 14:22:11 compute-0 sudo[160515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:12 compute-0 python3.9[160517]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:12 compute-0 sudo[160515]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:12 compute-0 sudo[160638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqmofrmdnncaatzdezgyooryawlyudfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994131.6873016-95-133691142263452/AnsiballZ_copy.py'
Nov 24 14:22:12 compute-0 sudo[160638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:12 compute-0 python3.9[160640]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994131.6873016-95-133691142263452/.source.iscsi _original_basename=.o7b306ih follow=False checksum=5d5c7c8c7fe9742c93d29915f1ef1c8ef099bc1c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:12 compute-0 sudo[160638]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:13 compute-0 sudo[160790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-envolgljrnxexuspsmxdqzxkfnbhffnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994132.9708884-110-280208059432105/AnsiballZ_file.py'
Nov 24 14:22:13 compute-0 sudo[160790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:13 compute-0 python3.9[160792]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:13 compute-0 sudo[160790]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:14 compute-0 sudo[160942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rspzhnzmpnptulpbhaowswzirsbrawjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994133.7183385-118-30105191008983/AnsiballZ_lineinfile.py'
Nov 24 14:22:14 compute-0 sudo[160942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:14 compute-0 python3.9[160944]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:14 compute-0 sudo[160942]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:14 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 14:22:14 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 14:22:15 compute-0 sudo[161095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etwnsdlijsclwuzpraopscyvodumrtpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994134.5130281-127-232466369302494/AnsiballZ_systemd_service.py'
Nov 24 14:22:15 compute-0 sudo[161095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:15 compute-0 python3.9[161097]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:22:15 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 24 14:22:15 compute-0 sudo[161095]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:15 compute-0 sudo[161251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzeasrthmisnjbjkbzuzyozdsbkwfrvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994135.5522208-135-5949120661158/AnsiballZ_systemd_service.py'
Nov 24 14:22:15 compute-0 sudo[161251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:16 compute-0 python3.9[161253]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:22:16 compute-0 systemd[1]: Reloading.
Nov 24 14:22:16 compute-0 systemd-sysv-generator[161285]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:22:16 compute-0 systemd-rc-local-generator[161281]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:22:16 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 24 14:22:16 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 24 14:22:16 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Nov 24 14:22:16 compute-0 systemd[1]: Started Open-iSCSI.
Nov 24 14:22:16 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 24 14:22:16 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 24 14:22:16 compute-0 sudo[161251]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:17 compute-0 sudo[161452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlwibzewhzfgnwyxzljpdzjrdfxeyfwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994136.83378-146-15893165704770/AnsiballZ_service_facts.py'
Nov 24 14:22:17 compute-0 sudo[161452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:17 compute-0 python3.9[161454]: ansible-ansible.builtin.service_facts Invoked
Nov 24 14:22:17 compute-0 network[161471]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 14:22:17 compute-0 network[161472]: 'network-scripts' will be removed from distribution in near future.
Nov 24 14:22:17 compute-0 network[161473]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 14:22:20 compute-0 sudo[161452]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:20 compute-0 sudo[161742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvmylkyryimcbjfcglgdduersraopvyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994140.2835438-156-82333965676106/AnsiballZ_file.py'
Nov 24 14:22:20 compute-0 sudo[161742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:20 compute-0 python3.9[161744]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 24 14:22:20 compute-0 sudo[161742]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:21 compute-0 sudo[161894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfkunsvjsrfoutxbsysyxpikrlgpeyeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994140.9031-164-130104072782604/AnsiballZ_modprobe.py'
Nov 24 14:22:21 compute-0 sudo[161894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:21 compute-0 python3.9[161896]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 24 14:22:21 compute-0 sudo[161894]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:21 compute-0 sudo[162050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zbqlfuzekjlddyzsicivwnxlokkwiekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994141.6710768-172-98323110784842/AnsiballZ_stat.py'
Nov 24 14:22:21 compute-0 sudo[162050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:22 compute-0 python3.9[162052]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:22 compute-0 sudo[162050]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:22 compute-0 sudo[162173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faggmixoylsrrsdzecpvmwszciuxkuqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994141.6710768-172-98323110784842/AnsiballZ_copy.py'
Nov 24 14:22:22 compute-0 sudo[162173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:22 compute-0 python3.9[162175]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994141.6710768-172-98323110784842/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:22 compute-0 sudo[162173]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:23 compute-0 sudo[162325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtycsxqidgzambnruasdhapmbdtdmgdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994142.7928512-188-153372022986356/AnsiballZ_lineinfile.py'
Nov 24 14:22:23 compute-0 sudo[162325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:23 compute-0 python3.9[162327]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:23 compute-0 sudo[162325]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:23 compute-0 sudo[162477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usnegszykkwxbilzyyjmfmtfwtswyzfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994143.3696222-196-100686466543027/AnsiballZ_systemd.py'
Nov 24 14:22:23 compute-0 sudo[162477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:24 compute-0 python3.9[162479]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:22:24 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 24 14:22:24 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 24 14:22:24 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 24 14:22:24 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 24 14:22:24 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 24 14:22:24 compute-0 sudo[162477]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:24 compute-0 sudo[162633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-styhjbtimvgcfhtqdalbczwuosloxcao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994144.481279-204-156709701167702/AnsiballZ_file.py'
Nov 24 14:22:24 compute-0 sudo[162633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:24 compute-0 python3.9[162635]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:22:24 compute-0 sudo[162633]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:25 compute-0 sudo[162785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttivemqyhgviqbfpcksgibzzuvywcaxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994145.150546-213-223155440601433/AnsiballZ_stat.py'
Nov 24 14:22:25 compute-0 sudo[162785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:25 compute-0 python3.9[162787]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:22:25 compute-0 sudo[162785]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:26 compute-0 sudo[162937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjqcebbaxfwvhinvjaegmimxzaajbppg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994145.7737937-222-28420148699893/AnsiballZ_stat.py'
Nov 24 14:22:26 compute-0 sudo[162937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:26 compute-0 python3.9[162939]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:22:26 compute-0 sudo[162937]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:26 compute-0 sudo[163089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isteuhioecigztyzemakgrpsrrqvujvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994146.4924262-230-22281086581612/AnsiballZ_stat.py'
Nov 24 14:22:26 compute-0 sudo[163089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:26 compute-0 python3.9[163091]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:26 compute-0 sudo[163089]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:27 compute-0 sudo[163213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aojmkwiapdhfpferxbumasafrejcltkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994146.4924262-230-22281086581612/AnsiballZ_copy.py'
Nov 24 14:22:27 compute-0 sudo[163213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:27 compute-0 python3.9[163215]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994146.4924262-230-22281086581612/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:27 compute-0 sudo[163213]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:27 compute-0 sudo[163365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azjwkmpadsjmrjgbcwqrkrpgzbxwuxmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994147.6158037-245-253011093877590/AnsiballZ_command.py'
Nov 24 14:22:27 compute-0 sudo[163365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:28 compute-0 python3.9[163367]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:22:28 compute-0 sudo[163365]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:28 compute-0 sudo[163529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umcjtxtwocbjabtkvpqdyolirhqyqtwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994148.2247648-253-59306979719007/AnsiballZ_lineinfile.py'
Nov 24 14:22:28 compute-0 sudo[163529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:28 compute-0 podman[163492]: 2025-11-24 14:22:28.486539021 +0000 UTC m=+0.049026856 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 14:22:28 compute-0 python3.9[163538]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:28 compute-0 sudo[163529]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:29 compute-0 sudo[163690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zajspxsxodkkvgjaiccpbeddchinwggh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994148.8156238-261-77754818642349/AnsiballZ_replace.py'
Nov 24 14:22:29 compute-0 sudo[163690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:29 compute-0 python3.9[163692]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:29 compute-0 sudo[163690]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:29 compute-0 sudo[163842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agzcjwwyhyagqyirgdlytcoirqqbetry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994149.6115334-269-92065607413000/AnsiballZ_replace.py'
Nov 24 14:22:29 compute-0 sudo[163842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:30 compute-0 python3.9[163844]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:30 compute-0 sudo[163842]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:30 compute-0 sudo[163994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrjnhxqdtkudsobhhsaskhlobmnjqhle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994150.208655-278-105461417149001/AnsiballZ_lineinfile.py'
Nov 24 14:22:30 compute-0 sudo[163994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:30 compute-0 python3.9[163996]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:30 compute-0 sudo[163994]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:31 compute-0 sudo[164146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spyweinvhcwycextktytsagsuanbscfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994150.7758656-278-19651347595772/AnsiballZ_lineinfile.py'
Nov 24 14:22:31 compute-0 sudo[164146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:31 compute-0 python3.9[164148]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:31 compute-0 sudo[164146]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:31 compute-0 sudo[164298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrsaojlrkhopywxbynijpowycunoaccc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994151.3407445-278-48569818587920/AnsiballZ_lineinfile.py'
Nov 24 14:22:31 compute-0 sudo[164298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:31 compute-0 python3.9[164300]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:31 compute-0 sudo[164298]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:32 compute-0 sudo[164450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssufcuccetaurxmifhymikuylheprwgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994151.9857929-278-266668742944080/AnsiballZ_lineinfile.py'
Nov 24 14:22:32 compute-0 sudo[164450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:32 compute-0 python3.9[164452]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:32 compute-0 sudo[164450]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:32 compute-0 sudo[164602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sacathemnsbvmwbxoonnluznmoapuntq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994152.6328852-307-254898135440945/AnsiballZ_stat.py'
Nov 24 14:22:32 compute-0 sudo[164602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:33 compute-0 python3.9[164604]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:22:33 compute-0 sudo[164602]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:33 compute-0 sudo[164756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeikbfiyapaojmashakylhjooaxdfozj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994153.2540216-315-154476333369963/AnsiballZ_file.py'
Nov 24 14:22:33 compute-0 sudo[164756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:33 compute-0 python3.9[164758]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:33 compute-0 sudo[164756]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:34 compute-0 sudo[164908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuqtegzohaljjejidfhrmcfjmebnqenw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994153.9205372-324-217975309686501/AnsiballZ_file.py'
Nov 24 14:22:34 compute-0 sudo[164908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:34 compute-0 python3.9[164910]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:22:34 compute-0 sudo[164908]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:34 compute-0 sudo[165060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grgmkwbopkxpuaojkmohrilnldxhgqrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994154.539326-332-27219771351847/AnsiballZ_stat.py'
Nov 24 14:22:34 compute-0 sudo[165060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:35 compute-0 python3.9[165062]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:35 compute-0 sudo[165060]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:35 compute-0 sudo[165138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alnormsiojnheehvavvfxjlsraxxjcvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994154.539326-332-27219771351847/AnsiballZ_file.py'
Nov 24 14:22:35 compute-0 sudo[165138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:35 compute-0 python3.9[165140]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:22:35 compute-0 sudo[165138]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:36 compute-0 sudo[165290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpuqcuxjfrvmpohuxcgmmrmebdjlfxoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994155.7913399-332-64114868539144/AnsiballZ_stat.py'
Nov 24 14:22:36 compute-0 sudo[165290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:36 compute-0 python3.9[165292]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:36 compute-0 sudo[165290]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:36 compute-0 sudo[165368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjzzhmotrtdclhfxyjbgjhkwfoshcrsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994155.7913399-332-64114868539144/AnsiballZ_file.py'
Nov 24 14:22:36 compute-0 sudo[165368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:36 compute-0 python3.9[165370]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:22:36 compute-0 sudo[165368]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:37 compute-0 sudo[165520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbvltyczfftvjaxjbwrdxndqmfijixlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994156.8285284-355-192059559613186/AnsiballZ_file.py'
Nov 24 14:22:37 compute-0 sudo[165520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:37 compute-0 python3.9[165522]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:37 compute-0 sudo[165520]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:37 compute-0 sudo[165672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzmrmzbbrbifyvicbziymazuhccvqtpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994157.415413-363-201706098813013/AnsiballZ_stat.py'
Nov 24 14:22:37 compute-0 sudo[165672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:37 compute-0 python3.9[165674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:37 compute-0 sudo[165672]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:38 compute-0 sudo[165750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qebqoupjxoiozrdbtqzckotlcuyyfspz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994157.415413-363-201706098813013/AnsiballZ_file.py'
Nov 24 14:22:38 compute-0 sudo[165750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:38 compute-0 python3.9[165752]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:38 compute-0 sudo[165750]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:38 compute-0 sudo[165916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czhmlwoomplcytdguxdmkldjxollbqrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994158.547179-375-31527465212188/AnsiballZ_stat.py'
Nov 24 14:22:38 compute-0 sudo[165916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:38 compute-0 podman[165876]: 2025-11-24 14:22:38.892064609 +0000 UTC m=+0.078585800 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 14:22:39 compute-0 python3.9[165923]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:39 compute-0 sudo[165916]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:39 compute-0 sudo[166006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqqbmotsoawclzwvxuhkasptwfdrhhmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994158.547179-375-31527465212188/AnsiballZ_file.py'
Nov 24 14:22:39 compute-0 sudo[166006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:39 compute-0 python3.9[166008]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:39 compute-0 sudo[166006]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:40 compute-0 sudo[166158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yykfhngeoyctpspwtwyrbitwwcqqcynk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994159.8723962-387-17276611686288/AnsiballZ_systemd.py'
Nov 24 14:22:40 compute-0 sudo[166158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:40 compute-0 python3.9[166160]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:22:40 compute-0 systemd[1]: Reloading.
Nov 24 14:22:40 compute-0 systemd-sysv-generator[166190]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:22:40 compute-0 systemd-rc-local-generator[166186]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:22:40 compute-0 sudo[166158]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:41 compute-0 sudo[166346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hphytavehdqapipkrxyohvcykbrzqflm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994160.9805562-395-253270716399537/AnsiballZ_stat.py'
Nov 24 14:22:41 compute-0 sudo[166346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:41 compute-0 python3.9[166348]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:41 compute-0 sudo[166346]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:41 compute-0 sudo[166424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrijfujggnicnkdelfhakrghlmjolubz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994160.9805562-395-253270716399537/AnsiballZ_file.py'
Nov 24 14:22:41 compute-0 sudo[166424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:41 compute-0 python3.9[166426]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:41 compute-0 sudo[166424]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:42 compute-0 sudo[166576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cciujapykvhgxahkkywgzvvzybxaeawj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994162.0218785-407-112644740005389/AnsiballZ_stat.py'
Nov 24 14:22:42 compute-0 sudo[166576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:42 compute-0 python3.9[166578]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:42 compute-0 sudo[166576]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:42 compute-0 sudo[166654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dunwxtihhhygvxuqnsxbgjxayrjbpnke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994162.0218785-407-112644740005389/AnsiballZ_file.py'
Nov 24 14:22:42 compute-0 sudo[166654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:42 compute-0 python3.9[166656]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:42 compute-0 sudo[166654]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:43 compute-0 sudo[166806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofznzguvfpjevwdxytyisveyhltyybjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994163.0139623-419-100695200912046/AnsiballZ_systemd.py'
Nov 24 14:22:43 compute-0 sudo[166806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:43 compute-0 python3.9[166808]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:22:43 compute-0 systemd[1]: Reloading.
Nov 24 14:22:43 compute-0 systemd-rc-local-generator[166834]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:22:43 compute-0 systemd-sysv-generator[166839]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:22:43 compute-0 systemd[1]: Starting Create netns directory...
Nov 24 14:22:43 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 24 14:22:43 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 24 14:22:43 compute-0 systemd[1]: Finished Create netns directory.
Nov 24 14:22:43 compute-0 sudo[166806]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:44 compute-0 sudo[166999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjvlqjeolgeliwxcxknlcgllyqqxffln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994164.2720516-429-258138435050678/AnsiballZ_file.py'
Nov 24 14:22:44 compute-0 sudo[166999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:44 compute-0 python3.9[167001]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:22:44 compute-0 sudo[166999]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:45 compute-0 sudo[167151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atoystvpwcnapzrpvxndubvtviegcmbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994164.9551864-437-47580182885469/AnsiballZ_stat.py'
Nov 24 14:22:45 compute-0 sudo[167151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:45 compute-0 python3.9[167153]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:45 compute-0 sudo[167151]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:45 compute-0 sudo[167274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgfpqlllfuekiumbvpxzmwplitoouqdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994164.9551864-437-47580182885469/AnsiballZ_copy.py'
Nov 24 14:22:45 compute-0 sudo[167274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:45 compute-0 python3.9[167276]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994164.9551864-437-47580182885469/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:22:46 compute-0 sudo[167274]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:46 compute-0 sudo[167426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wamwvnmlnkouvrwbueqfaqlohtwnuyef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994166.3276567-454-230157803589366/AnsiballZ_file.py'
Nov 24 14:22:46 compute-0 sudo[167426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:46 compute-0 python3.9[167428]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:22:46 compute-0 sudo[167426]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:47 compute-0 sudo[167578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-segaiazgwtixzrwxgjasedftagmqubpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994166.9897304-462-39845527748367/AnsiballZ_stat.py'
Nov 24 14:22:47 compute-0 sudo[167578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:47 compute-0 python3.9[167580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:22:47 compute-0 sudo[167578]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:47 compute-0 sudo[167701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqrhlkxzemusqdmsbxzepmufytebucfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994166.9897304-462-39845527748367/AnsiballZ_copy.py'
Nov 24 14:22:47 compute-0 sudo[167701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:48 compute-0 python3.9[167703]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994166.9897304-462-39845527748367/.source.json _original_basename=.1s148v9x follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:48 compute-0 sudo[167701]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:48 compute-0 sudo[167853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vniplpecgsbffabyycuzieyjwdmulxmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994168.2331939-477-194467454489898/AnsiballZ_file.py'
Nov 24 14:22:48 compute-0 sudo[167853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:48 compute-0 python3.9[167855]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:48 compute-0 sudo[167853]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:49 compute-0 sudo[168005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzxbzgnxgjedblkznpcofgifahwbrxzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994168.9249368-485-84629852626775/AnsiballZ_stat.py'
Nov 24 14:22:49 compute-0 sudo[168005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:49 compute-0 sudo[168005]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:49 compute-0 sudo[168128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdhpyfteoolodtsgkixwqhkpturfazgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994168.9249368-485-84629852626775/AnsiballZ_copy.py'
Nov 24 14:22:49 compute-0 sudo[168128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:49 compute-0 sudo[168128]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:50 compute-0 sudo[168280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctntummsyxeevfwzslzvfdniposggqan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994170.1886265-502-155343993562435/AnsiballZ_container_config_data.py'
Nov 24 14:22:50 compute-0 sudo[168280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:50 compute-0 python3.9[168282]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 24 14:22:50 compute-0 sudo[168280]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:51 compute-0 sudo[168432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjslpqxvqxdarkswdozqpbrvvslkxqoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994171.03045-511-192060520402126/AnsiballZ_container_config_hash.py'
Nov 24 14:22:51 compute-0 sudo[168432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:51 compute-0 python3.9[168434]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 14:22:51 compute-0 sudo[168432]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:52 compute-0 sudo[168584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uohtwwkqwswnzihnpsxkhuauepxjanqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994171.89565-520-64180203927528/AnsiballZ_podman_container_info.py'
Nov 24 14:22:52 compute-0 sudo[168584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:52 compute-0 python3.9[168586]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 24 14:22:52 compute-0 sudo[168584]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:53 compute-0 sudo[168762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrtuldpngqawhpztzwzfrbcmaixrdwlr ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763994173.0643485-533-58042031289455/AnsiballZ_edpm_container_manage.py'
Nov 24 14:22:53 compute-0 sudo[168762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:53 compute-0 python3[168764]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 14:22:53 compute-0 podman[168799]: 2025-11-24 14:22:53.948129705 +0000 UTC m=+0.041100032 container create 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 14:22:53 compute-0 podman[168799]: 2025-11-24 14:22:53.926423924 +0000 UTC m=+0.019394271 image pull 5a87eb2d1bea5c4c3bce654551fc0b05a96cf5556b36110e17bddeee8189b072 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24
Nov 24 14:22:53 compute-0 python3[168764]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24
Nov 24 14:22:54 compute-0 sudo[168762]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:54 compute-0 sudo[168987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kihfglykslpewcuxgsoafrmfknlvtzxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994174.2621036-541-29165831535552/AnsiballZ_stat.py'
Nov 24 14:22:54 compute-0 sudo[168987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:54 compute-0 python3.9[168989]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:22:54 compute-0 sudo[168987]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:55 compute-0 sudo[169141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eufmgsjlehlktqmkbhrrgmalbxemhzws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994174.9630752-550-41299300344911/AnsiballZ_file.py'
Nov 24 14:22:55 compute-0 sudo[169141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:55 compute-0 python3.9[169143]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:55 compute-0 sudo[169141]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:55 compute-0 sudo[169217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmtwtuhwfylcncqodsqrxuokascwqamu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994174.9630752-550-41299300344911/AnsiballZ_stat.py'
Nov 24 14:22:55 compute-0 sudo[169217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:55 compute-0 python3.9[169219]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:22:55 compute-0 sudo[169217]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:56 compute-0 sudo[169368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wigeziedyhxgsknjeiphtenkqdzrmdxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994175.8971179-550-222511125205066/AnsiballZ_copy.py'
Nov 24 14:22:56 compute-0 sudo[169368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:56 compute-0 python3.9[169370]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763994175.8971179-550-222511125205066/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:22:56 compute-0 sudo[169368]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:22:56.647 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:22:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:22:56.650 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:22:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:22:56.651 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:22:56 compute-0 sudo[169444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njqjopzhbiwryiwvqroaifworvmohroi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994175.8971179-550-222511125205066/AnsiballZ_systemd.py'
Nov 24 14:22:56 compute-0 sudo[169444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:57 compute-0 python3.9[169446]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:22:57 compute-0 systemd[1]: Reloading.
Nov 24 14:22:57 compute-0 systemd-sysv-generator[169476]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:22:57 compute-0 systemd-rc-local-generator[169473]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:22:57 compute-0 sudo[169444]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:57 compute-0 sudo[169555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eysfkwoyzobykkmzkkozdwablzmdsppr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994175.8971179-550-222511125205066/AnsiballZ_systemd.py'
Nov 24 14:22:57 compute-0 sudo[169555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:57 compute-0 python3.9[169557]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:22:57 compute-0 systemd[1]: Reloading.
Nov 24 14:22:58 compute-0 systemd-rc-local-generator[169586]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:22:58 compute-0 systemd-sysv-generator[169591]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:22:58 compute-0 systemd[1]: Starting multipathd container...
Nov 24 14:22:58 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:22:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3a32344f8e2fd53cb8d636575688a9eaef59fe67d2f1fa77c042839e0429b41/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 14:22:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3a32344f8e2fd53cb8d636575688a9eaef59fe67d2f1fa77c042839e0429b41/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 14:22:58 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c.
Nov 24 14:22:58 compute-0 podman[169597]: 2025-11-24 14:22:58.331293521 +0000 UTC m=+0.111419847 container init 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:22:58 compute-0 multipathd[169613]: + sudo -E kolla_set_configs
Nov 24 14:22:58 compute-0 podman[169597]: 2025-11-24 14:22:58.358123403 +0000 UTC m=+0.138249719 container start 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 14:22:58 compute-0 podman[169597]: multipathd
Nov 24 14:22:58 compute-0 sudo[169619]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 14:22:58 compute-0 sudo[169619]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 14:22:58 compute-0 sudo[169619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 14:22:58 compute-0 systemd[1]: Started multipathd container.
Nov 24 14:22:58 compute-0 sudo[169555]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:58 compute-0 multipathd[169613]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 14:22:58 compute-0 multipathd[169613]: INFO:__main__:Validating config file
Nov 24 14:22:58 compute-0 multipathd[169613]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 14:22:58 compute-0 multipathd[169613]: INFO:__main__:Writing out command to execute
Nov 24 14:22:58 compute-0 sudo[169619]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:58 compute-0 multipathd[169613]: ++ cat /run_command
Nov 24 14:22:58 compute-0 multipathd[169613]: + CMD='/usr/sbin/multipathd -d'
Nov 24 14:22:58 compute-0 multipathd[169613]: + ARGS=
Nov 24 14:22:58 compute-0 multipathd[169613]: + sudo kolla_copy_cacerts
Nov 24 14:22:58 compute-0 podman[169620]: 2025-11-24 14:22:58.417384013 +0000 UTC m=+0.050130334 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Nov 24 14:22:58 compute-0 systemd[1]: 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c-716ab7fd64c7252b.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 14:22:58 compute-0 systemd[1]: 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c-716ab7fd64c7252b.service: Failed with result 'exit-code'.
Nov 24 14:22:58 compute-0 sudo[169643]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 14:22:58 compute-0 sudo[169643]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 14:22:58 compute-0 sudo[169643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 14:22:58 compute-0 sudo[169643]: pam_unix(sudo:session): session closed for user root
Nov 24 14:22:58 compute-0 multipathd[169613]: + [[ ! -n '' ]]
Nov 24 14:22:58 compute-0 multipathd[169613]: + . kolla_extend_start
Nov 24 14:22:58 compute-0 multipathd[169613]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 24 14:22:58 compute-0 multipathd[169613]: Running command: '/usr/sbin/multipathd -d'
Nov 24 14:22:58 compute-0 multipathd[169613]: + umask 0022
Nov 24 14:22:58 compute-0 multipathd[169613]: + exec /usr/sbin/multipathd -d
Nov 24 14:22:58 compute-0 multipathd[169613]: 2445.932095 | --------start up--------
Nov 24 14:22:58 compute-0 multipathd[169613]: 2445.932112 | read /etc/multipath.conf
Nov 24 14:22:58 compute-0 multipathd[169613]: 2445.937081 | path checkers start up
Nov 24 14:22:58 compute-0 podman[169777]: 2025-11-24 14:22:58.829492543 +0000 UTC m=+0.047970580 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 14:22:58 compute-0 python3.9[169815]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:22:59 compute-0 sudo[169973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muqsuopbjbopdakpoywlawbcckllhqni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994179.162684-586-86412529590085/AnsiballZ_command.py'
Nov 24 14:22:59 compute-0 sudo[169973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:22:59 compute-0 python3.9[169975]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:22:59 compute-0 sudo[169973]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:00 compute-0 sudo[170138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aumaxenqjqvzzucmmymlpygfwgstexzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994179.8046622-594-38288699620947/AnsiballZ_systemd.py'
Nov 24 14:23:00 compute-0 sudo[170138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:00 compute-0 python3.9[170140]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:23:00 compute-0 systemd[1]: Stopping multipathd container...
Nov 24 14:23:00 compute-0 multipathd[169613]: 2448.002700 | exit (signal)
Nov 24 14:23:00 compute-0 multipathd[169613]: 2448.003355 | --------shut down-------
Nov 24 14:23:00 compute-0 systemd[1]: libpod-8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c.scope: Deactivated successfully.
Nov 24 14:23:00 compute-0 podman[170144]: 2025-11-24 14:23:00.545030201 +0000 UTC m=+0.077054511 container stop 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:23:00 compute-0 podman[170144]: 2025-11-24 14:23:00.574864761 +0000 UTC m=+0.106889031 container died 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:23:00 compute-0 systemd[1]: 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c-716ab7fd64c7252b.timer: Deactivated successfully.
Nov 24 14:23:00 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c.
Nov 24 14:23:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c-userdata-shm.mount: Deactivated successfully.
Nov 24 14:23:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-e3a32344f8e2fd53cb8d636575688a9eaef59fe67d2f1fa77c042839e0429b41-merged.mount: Deactivated successfully.
Nov 24 14:23:00 compute-0 podman[170144]: 2025-11-24 14:23:00.619627927 +0000 UTC m=+0.151652207 container cleanup 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0)
Nov 24 14:23:00 compute-0 podman[170144]: multipathd
Nov 24 14:23:00 compute-0 podman[170174]: multipathd
Nov 24 14:23:00 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 24 14:23:00 compute-0 systemd[1]: Stopped multipathd container.
Nov 24 14:23:00 compute-0 systemd[1]: Starting multipathd container...
Nov 24 14:23:00 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:23:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3a32344f8e2fd53cb8d636575688a9eaef59fe67d2f1fa77c042839e0429b41/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 14:23:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3a32344f8e2fd53cb8d636575688a9eaef59fe67d2f1fa77c042839e0429b41/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 14:23:00 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c.
Nov 24 14:23:00 compute-0 podman[170187]: 2025-11-24 14:23:00.861809769 +0000 UTC m=+0.133206861 container init 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 24 14:23:00 compute-0 multipathd[170200]: + sudo -E kolla_set_configs
Nov 24 14:23:00 compute-0 sudo[170209]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 14:23:00 compute-0 sudo[170209]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 14:23:00 compute-0 sudo[170209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 14:23:00 compute-0 podman[170187]: 2025-11-24 14:23:00.901741059 +0000 UTC m=+0.173138141 container start 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 14:23:00 compute-0 podman[170187]: multipathd
Nov 24 14:23:00 compute-0 systemd[1]: Started multipathd container.
Nov 24 14:23:00 compute-0 sudo[170138]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:00 compute-0 multipathd[170200]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 14:23:00 compute-0 multipathd[170200]: INFO:__main__:Validating config file
Nov 24 14:23:00 compute-0 multipathd[170200]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 14:23:00 compute-0 multipathd[170200]: INFO:__main__:Writing out command to execute
Nov 24 14:23:00 compute-0 sudo[170209]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:00 compute-0 multipathd[170200]: ++ cat /run_command
Nov 24 14:23:00 compute-0 podman[170210]: 2025-11-24 14:23:00.989561687 +0000 UTC m=+0.073899559 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd)
Nov 24 14:23:00 compute-0 multipathd[170200]: + CMD='/usr/sbin/multipathd -d'
Nov 24 14:23:00 compute-0 multipathd[170200]: + ARGS=
Nov 24 14:23:00 compute-0 multipathd[170200]: + sudo kolla_copy_cacerts
Nov 24 14:23:00 compute-0 systemd[1]: 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c-24ca6af124ca71d1.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 14:23:00 compute-0 systemd[1]: 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c-24ca6af124ca71d1.service: Failed with result 'exit-code'.
Nov 24 14:23:01 compute-0 sudo[170238]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 14:23:01 compute-0 sudo[170238]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 14:23:01 compute-0 sudo[170238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Nov 24 14:23:01 compute-0 sudo[170238]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:01 compute-0 multipathd[170200]: + [[ ! -n '' ]]
Nov 24 14:23:01 compute-0 multipathd[170200]: + . kolla_extend_start
Nov 24 14:23:01 compute-0 multipathd[170200]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 24 14:23:01 compute-0 multipathd[170200]: Running command: '/usr/sbin/multipathd -d'
Nov 24 14:23:01 compute-0 multipathd[170200]: + umask 0022
Nov 24 14:23:01 compute-0 multipathd[170200]: + exec /usr/sbin/multipathd -d
Nov 24 14:23:01 compute-0 multipathd[170200]: 2448.521383 | --------start up--------
Nov 24 14:23:01 compute-0 multipathd[170200]: 2448.521407 | read /etc/multipath.conf
Nov 24 14:23:01 compute-0 multipathd[170200]: 2448.528246 | path checkers start up
Nov 24 14:23:01 compute-0 sudo[170392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htcnaqcufgafqupqhkacpwhixwdshuua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994181.2180433-602-148092495965813/AnsiballZ_file.py'
Nov 24 14:23:01 compute-0 sudo[170392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:01 compute-0 python3.9[170394]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:01 compute-0 sudo[170392]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:02 compute-0 sudo[170544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypzjxfdbczjgadsbmalpxmugnfhxnekv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994182.188663-614-206416863798073/AnsiballZ_file.py'
Nov 24 14:23:02 compute-0 sudo[170544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:02 compute-0 python3.9[170546]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 24 14:23:02 compute-0 sudo[170544]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:03 compute-0 sudo[170696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwunypwqujjxzmmatjqyjizkntaxivjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994182.7911806-622-204600895762677/AnsiballZ_modprobe.py'
Nov 24 14:23:03 compute-0 sudo[170696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:03 compute-0 python3.9[170698]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 24 14:23:03 compute-0 kernel: Key type psk registered
Nov 24 14:23:03 compute-0 sudo[170696]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:03 compute-0 sudo[170858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpgaebzkzszjakluqmjahagnlmoxpwqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994183.4555213-630-137323019190278/AnsiballZ_stat.py'
Nov 24 14:23:03 compute-0 sudo[170858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:03 compute-0 python3.9[170860]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:23:03 compute-0 sudo[170858]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:04 compute-0 sudo[170981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfveobskgeuwscdnepufjincqglefmog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994183.4555213-630-137323019190278/AnsiballZ_copy.py'
Nov 24 14:23:04 compute-0 sudo[170981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:04 compute-0 python3.9[170983]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994183.4555213-630-137323019190278/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:04 compute-0 sudo[170981]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:04 compute-0 sudo[171133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntxrgnhugsebysxinegqsaougftqzrcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994184.6882355-646-246934074916550/AnsiballZ_lineinfile.py'
Nov 24 14:23:04 compute-0 sudo[171133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:05 compute-0 python3.9[171135]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:05 compute-0 sudo[171133]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:05 compute-0 sudo[171285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqndukiukqbuyuotbnyxbadozotdjjqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994185.379238-654-71857048164465/AnsiballZ_systemd.py'
Nov 24 14:23:05 compute-0 sudo[171285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:06 compute-0 python3.9[171287]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:23:06 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 24 14:23:06 compute-0 systemd[1]: Stopped Load Kernel Modules.
Nov 24 14:23:06 compute-0 systemd[1]: Stopping Load Kernel Modules...
Nov 24 14:23:06 compute-0 systemd[1]: Starting Load Kernel Modules...
Nov 24 14:23:06 compute-0 systemd[1]: Finished Load Kernel Modules.
Nov 24 14:23:06 compute-0 sudo[171285]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:06 compute-0 sudo[171441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwcovzvuezzycafxxktopubjefjguowg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994186.4743102-662-134974775161306/AnsiballZ_dnf.py'
Nov 24 14:23:06 compute-0 sudo[171441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:07 compute-0 python3.9[171443]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 24 14:23:09 compute-0 podman[171448]: 2025-11-24 14:23:09.479061911 +0000 UTC m=+0.088354012 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 14:23:09 compute-0 systemd[1]: Reloading.
Nov 24 14:23:09 compute-0 systemd-rc-local-generator[171501]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:23:09 compute-0 systemd-sysv-generator[171506]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:23:10 compute-0 systemd[1]: Reloading.
Nov 24 14:23:10 compute-0 systemd-rc-local-generator[171537]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:23:10 compute-0 systemd-sysv-generator[171540]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:23:10 compute-0 virtproxyd[153264]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 24 14:23:10 compute-0 virtproxyd[153264]: hostname: compute-0
Nov 24 14:23:10 compute-0 virtproxyd[153264]: nl_recv returned with error: No buffer space available
Nov 24 14:23:10 compute-0 systemd-logind[807]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 24 14:23:10 compute-0 systemd-logind[807]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 24 14:23:10 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 24 14:23:10 compute-0 systemd[1]: Starting man-db-cache-update.service...
Nov 24 14:23:10 compute-0 systemd[1]: Reloading.
Nov 24 14:23:10 compute-0 systemd-rc-local-generator[171629]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:23:10 compute-0 systemd-sysv-generator[171633]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:23:10 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 24 14:23:11 compute-0 sudo[171441]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:11 compute-0 sudo[172919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsoueulgirwttpgmieekzhfqkmzpugxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994191.6495574-670-278087871667112/AnsiballZ_systemd_service.py'
Nov 24 14:23:11 compute-0 sudo[172919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:12 compute-0 python3.9[172921]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:23:12 compute-0 iscsid[161294]: iscsid shutting down.
Nov 24 14:23:12 compute-0 systemd[1]: Stopping Open-iSCSI...
Nov 24 14:23:12 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Nov 24 14:23:12 compute-0 systemd[1]: Stopped Open-iSCSI.
Nov 24 14:23:12 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 24 14:23:12 compute-0 systemd[1]: Starting Open-iSCSI...
Nov 24 14:23:12 compute-0 systemd[1]: Started Open-iSCSI.
Nov 24 14:23:12 compute-0 sudo[172919]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:12 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 24 14:23:12 compute-0 systemd[1]: Finished man-db-cache-update.service.
Nov 24 14:23:12 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.420s CPU time.
Nov 24 14:23:12 compute-0 systemd[1]: run-r7508f63f4bf94cbdb37452893c208bd8.service: Deactivated successfully.
Nov 24 14:23:13 compute-0 python3.9[173076]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:23:13 compute-0 sudo[173230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkrcivaeosvolxdpmzqckkblgokntqiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994193.5209155-688-189653094015562/AnsiballZ_file.py'
Nov 24 14:23:13 compute-0 sudo[173230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:13 compute-0 python3.9[173232]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:14 compute-0 sudo[173230]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:14 compute-0 sudo[173382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuqbkakdqdyymnldqzjuzhmsfbfyksmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994194.3957448-699-271424259842173/AnsiballZ_systemd_service.py'
Nov 24 14:23:14 compute-0 sudo[173382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:15 compute-0 python3.9[173384]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:23:15 compute-0 systemd[1]: Reloading.
Nov 24 14:23:15 compute-0 systemd-rc-local-generator[173411]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:23:15 compute-0 systemd-sysv-generator[173414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:23:15 compute-0 sudo[173382]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:15 compute-0 python3.9[173568]: ansible-ansible.builtin.service_facts Invoked
Nov 24 14:23:16 compute-0 network[173585]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 14:23:16 compute-0 network[173586]: 'network-scripts' will be removed from distribution in near future.
Nov 24 14:23:16 compute-0 network[173587]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 14:23:22 compute-0 sudo[173859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpfaavbtrdekvxyocwnpajadewxtcgiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994201.7063274-718-64186563596810/AnsiballZ_systemd_service.py'
Nov 24 14:23:22 compute-0 sudo[173859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:22 compute-0 python3.9[173861]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:23:22 compute-0 sudo[173859]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:22 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 24 14:23:22 compute-0 sudo[174013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrcejcpnvprpyvndfvbdwynjuxnwiqpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994202.659048-718-160339665466437/AnsiballZ_systemd_service.py'
Nov 24 14:23:22 compute-0 sudo[174013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:23 compute-0 python3.9[174015]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:23:23 compute-0 sudo[174013]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:23 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 24 14:23:23 compute-0 sudo[174167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eubtgifzejvsehztlmpfukndhdnbxued ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994203.469049-718-143987379216018/AnsiballZ_systemd_service.py'
Nov 24 14:23:23 compute-0 sudo[174167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:24 compute-0 python3.9[174169]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:23:24 compute-0 sudo[174167]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:24 compute-0 sudo[174320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnmdojwxhbfnvlqgpqvasqomtwhtzomo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994204.2158842-718-114099357081266/AnsiballZ_systemd_service.py'
Nov 24 14:23:24 compute-0 sudo[174320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:24 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 24 14:23:24 compute-0 python3.9[174322]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:23:24 compute-0 sudo[174320]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:25 compute-0 sudo[174474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjkhchesefwtlzssdcckhxqlbmdhvgrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994205.0370524-718-271124847443769/AnsiballZ_systemd_service.py'
Nov 24 14:23:25 compute-0 sudo[174474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:25 compute-0 python3.9[174476]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:23:25 compute-0 sudo[174474]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:25 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 24 14:23:26 compute-0 sudo[174628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kiacmrwvuixqeepyhickugwkmzowrvdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994205.781674-718-99012647302830/AnsiballZ_systemd_service.py'
Nov 24 14:23:26 compute-0 sudo[174628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:26 compute-0 python3.9[174630]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:23:26 compute-0 sudo[174628]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:26 compute-0 sudo[174781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrjxgliajhaepipykupfmouwwxcdweqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994206.6160493-718-224192884464337/AnsiballZ_systemd_service.py'
Nov 24 14:23:26 compute-0 sudo[174781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:27 compute-0 python3.9[174783]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:23:27 compute-0 sudo[174781]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:27 compute-0 sudo[174934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aathbvwstryytnliaosibalslpzrnkbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994207.4859493-718-257945308811803/AnsiballZ_systemd_service.py'
Nov 24 14:23:27 compute-0 sudo[174934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:28 compute-0 python3.9[174936]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:23:28 compute-0 sudo[174934]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:29 compute-0 sudo[175100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nefbfydxarfzeseaaeavclbmgoxzebfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994208.6331913-777-178102058234539/AnsiballZ_file.py'
Nov 24 14:23:29 compute-0 sudo[175100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:29 compute-0 podman[175061]: 2025-11-24 14:23:29.036582126 +0000 UTC m=+0.088265179 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 24 14:23:29 compute-0 python3.9[175108]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:29 compute-0 sudo[175100]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:29 compute-0 sudo[175259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlvhrnrcapggqydoiabnxikvghitybvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994209.3774064-777-239549470457713/AnsiballZ_file.py'
Nov 24 14:23:29 compute-0 sudo[175259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:29 compute-0 python3.9[175261]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:29 compute-0 sudo[175259]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:30 compute-0 sudo[175411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phuyggkuipqhzrskdydmexvddayynvkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994209.958594-777-26011502996145/AnsiballZ_file.py'
Nov 24 14:23:30 compute-0 sudo[175411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:30 compute-0 python3.9[175413]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:30 compute-0 sudo[175411]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:30 compute-0 sudo[175563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjbmfhnjguzedzswjqabnlpchyoyklvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994210.5297282-777-269568896502229/AnsiballZ_file.py'
Nov 24 14:23:30 compute-0 sudo[175563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:30 compute-0 python3.9[175565]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:31 compute-0 sudo[175563]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:31 compute-0 sudo[175731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuhyoajuypohyuznolbhplslhrxblqew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994211.1502135-777-198074246010664/AnsiballZ_file.py'
Nov 24 14:23:31 compute-0 sudo[175731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:31 compute-0 podman[175689]: 2025-11-24 14:23:31.415502399 +0000 UTC m=+0.054498581 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:23:31 compute-0 python3.9[175737]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:31 compute-0 sudo[175731]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:31 compute-0 sudo[175888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zafxmtvorunhpfjfkpmvuxhjphzjefcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994211.7092013-777-162882778599743/AnsiballZ_file.py'
Nov 24 14:23:31 compute-0 sudo[175888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:32 compute-0 python3.9[175890]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:32 compute-0 sudo[175888]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:32 compute-0 sudo[176040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbmfqycqgubdieapypuvgvuwhrvtyaqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994212.272022-777-111026985224436/AnsiballZ_file.py'
Nov 24 14:23:32 compute-0 sudo[176040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:32 compute-0 python3.9[176042]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:32 compute-0 sudo[176040]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:33 compute-0 sudo[176192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymwzampifsrkylfwrrtglhwrhaucuiqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994212.8374374-777-47795073526801/AnsiballZ_file.py'
Nov 24 14:23:33 compute-0 sudo[176192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:33 compute-0 python3.9[176194]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:33 compute-0 sudo[176192]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:33 compute-0 sudo[176344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkruplmlvzkfsesduylgshkvrwkkbgku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994213.451508-834-158941600813400/AnsiballZ_file.py'
Nov 24 14:23:33 compute-0 sudo[176344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:33 compute-0 python3.9[176346]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:33 compute-0 sudo[176344]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:34 compute-0 sudo[176496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lehbddkogmcexzgbjorvufsehyzwjsos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994213.9825082-834-188412289218665/AnsiballZ_file.py'
Nov 24 14:23:34 compute-0 sudo[176496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:34 compute-0 python3.9[176498]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:34 compute-0 sudo[176496]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:34 compute-0 sudo[176648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leenqiyrxvhytwdhqzwanjomqmitrebg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994214.48189-834-78140015548164/AnsiballZ_file.py'
Nov 24 14:23:34 compute-0 sudo[176648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:34 compute-0 python3.9[176650]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:34 compute-0 sudo[176648]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:35 compute-0 sudo[176800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aeameenxfacacsnxdbxmiaesevyhwrhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994215.034253-834-79482139526599/AnsiballZ_file.py'
Nov 24 14:23:35 compute-0 sudo[176800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:35 compute-0 python3.9[176802]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:35 compute-0 sudo[176800]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:35 compute-0 sudo[176952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxedwbzrwdwjbviesmjlssxdiavavbgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994215.5580537-834-231619131940857/AnsiballZ_file.py'
Nov 24 14:23:35 compute-0 sudo[176952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:36 compute-0 python3.9[176954]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:36 compute-0 sudo[176952]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:36 compute-0 sudo[177104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvpkonardbrfzdrmzlrbyctmcfuscqmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994216.1373067-834-199653866924522/AnsiballZ_file.py'
Nov 24 14:23:36 compute-0 sudo[177104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:36 compute-0 python3.9[177106]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:36 compute-0 sudo[177104]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:37 compute-0 sudo[177256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjlntadwpywyojseutfknuipbzecapsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994216.7976725-834-256651742164317/AnsiballZ_file.py'
Nov 24 14:23:37 compute-0 sudo[177256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:37 compute-0 python3.9[177258]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:37 compute-0 sudo[177256]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:37 compute-0 sudo[177408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcmnnvqbzvenmunzfwpfbyjczhhyhzwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994217.5142725-834-209765580317600/AnsiballZ_file.py'
Nov 24 14:23:37 compute-0 sudo[177408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:38 compute-0 python3.9[177410]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:23:38 compute-0 sudo[177408]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:38 compute-0 sudo[177560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaawrvtqmtlbfflutztvufufqgmsrycj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994218.334683-892-221215660912176/AnsiballZ_command.py'
Nov 24 14:23:38 compute-0 sudo[177560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:38 compute-0 python3.9[177562]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:23:38 compute-0 sudo[177560]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:39 compute-0 podman[177688]: 2025-11-24 14:23:39.662888388 +0000 UTC m=+0.094756564 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 14:23:39 compute-0 python3.9[177726]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 14:23:40 compute-0 sudo[177891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxyjlxbwncdcejsztpxftuvpobffsvhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994220.0440395-910-104363030643495/AnsiballZ_systemd_service.py'
Nov 24 14:23:40 compute-0 sudo[177891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:40 compute-0 python3.9[177893]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:23:40 compute-0 systemd[1]: Reloading.
Nov 24 14:23:40 compute-0 systemd-sysv-generator[177925]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:23:40 compute-0 systemd-rc-local-generator[177922]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:23:41 compute-0 sudo[177891]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:41 compute-0 sudo[178079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myuggwoscpdhfnqyojfjpwddwvcpvaxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994221.1658537-918-114179555518639/AnsiballZ_command.py'
Nov 24 14:23:41 compute-0 sudo[178079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:41 compute-0 python3.9[178081]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:23:41 compute-0 sudo[178079]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:42 compute-0 sudo[178232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxiwvtzhtyldchqcxmfyrzjczprujrxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994221.8073785-918-243368169890161/AnsiballZ_command.py'
Nov 24 14:23:42 compute-0 sudo[178232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:42 compute-0 python3.9[178234]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:23:42 compute-0 sudo[178232]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:42 compute-0 sudo[178385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llxqsntgljxnyyalmlvbwnkptykmbfap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994222.4150238-918-128576259835686/AnsiballZ_command.py'
Nov 24 14:23:42 compute-0 sudo[178385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:42 compute-0 python3.9[178387]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:23:42 compute-0 sudo[178385]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:43 compute-0 sudo[178538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmgyyycqnottaqzxaogltswalpqqjmeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994222.9427238-918-45870404817910/AnsiballZ_command.py'
Nov 24 14:23:43 compute-0 sudo[178538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:43 compute-0 python3.9[178540]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:23:43 compute-0 sudo[178538]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:43 compute-0 sudo[178691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbexetfjvpvputuhickuyeknooboqwii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994223.4859047-918-189142423882831/AnsiballZ_command.py'
Nov 24 14:23:43 compute-0 sudo[178691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:43 compute-0 python3.9[178693]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:23:44 compute-0 sudo[178691]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:44 compute-0 sudo[178844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytddhwoqaivsbvzkthcnatjqbwwfnedi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994224.138772-918-82453569637909/AnsiballZ_command.py'
Nov 24 14:23:44 compute-0 sudo[178844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:44 compute-0 python3.9[178846]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:23:44 compute-0 sudo[178844]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:44 compute-0 sudo[178997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puevrybycymvzxiukbtmwjnuxnvwkzey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994224.71814-918-65056867446522/AnsiballZ_command.py'
Nov 24 14:23:44 compute-0 sudo[178997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:45 compute-0 python3.9[178999]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:23:45 compute-0 sudo[178997]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:45 compute-0 sudo[179150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taxxqgmqugytjzryyloskbfbbnjnckig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994225.3384702-918-56082787206058/AnsiballZ_command.py'
Nov 24 14:23:45 compute-0 sudo[179150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:45 compute-0 python3.9[179152]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:23:45 compute-0 sudo[179150]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:47 compute-0 sudo[179303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elccwqutbufzdvutixukcjpwnainzchm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994226.8763804-997-100461915459133/AnsiballZ_file.py'
Nov 24 14:23:47 compute-0 sudo[179303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:47 compute-0 python3.9[179305]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:47 compute-0 sudo[179303]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:47 compute-0 sudo[179455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdorixyxkyuumqvohtkflouetnsquehv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994227.5036883-997-269453121767289/AnsiballZ_file.py'
Nov 24 14:23:47 compute-0 sudo[179455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:48 compute-0 python3.9[179457]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:48 compute-0 sudo[179455]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:48 compute-0 sudo[179607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxeegbmwmbveomodfdonhkndyefjital ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994228.2170572-997-102693138478566/AnsiballZ_file.py'
Nov 24 14:23:48 compute-0 sudo[179607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:48 compute-0 python3.9[179609]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:48 compute-0 sudo[179607]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:49 compute-0 sudo[179759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpidhaugwuyhrszfckbaayvdifhnxjvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994228.8482254-1019-64562222110239/AnsiballZ_file.py'
Nov 24 14:23:49 compute-0 sudo[179759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:49 compute-0 python3.9[179761]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:49 compute-0 sudo[179759]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:49 compute-0 sudo[179911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leanilnqcgxghkklbzjficqwuafnjuwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994229.4416084-1019-96031398800948/AnsiballZ_file.py'
Nov 24 14:23:49 compute-0 sudo[179911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:49 compute-0 python3.9[179913]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:49 compute-0 sudo[179911]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:50 compute-0 sudo[180063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzoyadqvabgtmuzfdrsdormvirscavdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994229.997083-1019-2776666325435/AnsiballZ_file.py'
Nov 24 14:23:50 compute-0 sudo[180063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:50 compute-0 python3.9[180065]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:50 compute-0 sudo[180063]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:50 compute-0 sudo[180215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpgilaunyuaiuvjxiatprwgieftnrxld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994230.6792555-1019-62593258166729/AnsiballZ_file.py'
Nov 24 14:23:50 compute-0 sudo[180215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:51 compute-0 python3.9[180217]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:51 compute-0 sudo[180215]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:51 compute-0 sudo[180367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnsuamipomfbusiupqqedpfzzqjlbvxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994231.2643745-1019-244745619973535/AnsiballZ_file.py'
Nov 24 14:23:51 compute-0 sudo[180367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:51 compute-0 python3.9[180369]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:51 compute-0 sudo[180367]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:52 compute-0 sudo[180519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bukvsxgqkhdvujtvbpamwzovsnfvkmil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994231.9561408-1019-247616040401728/AnsiballZ_file.py'
Nov 24 14:23:52 compute-0 sudo[180519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:52 compute-0 python3.9[180521]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:52 compute-0 sudo[180519]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:52 compute-0 sudo[180671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjhwuignqfhdtigddcimrwufnvfzhfjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994232.663011-1019-119981771610657/AnsiballZ_file.py'
Nov 24 14:23:52 compute-0 sudo[180671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:53 compute-0 python3.9[180673]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:23:53 compute-0 sudo[180671]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:23:56.648 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:23:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:23:56.649 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:23:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:23:56.649 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:23:57 compute-0 sudo[180823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxwmldmujdkyizrtmqzrgolfgdupjiei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994237.036727-1188-262098138801674/AnsiballZ_getent.py'
Nov 24 14:23:57 compute-0 sudo[180823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:57 compute-0 python3.9[180825]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 24 14:23:57 compute-0 sudo[180823]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:58 compute-0 sudo[180976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkqjfxbxkuyyzorxpstwenvxhgvhmsry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994237.806072-1196-151430275430026/AnsiballZ_group.py'
Nov 24 14:23:58 compute-0 sudo[180976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:58 compute-0 python3.9[180978]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 14:23:58 compute-0 groupadd[180979]: group added to /etc/group: name=nova, GID=42436
Nov 24 14:23:58 compute-0 groupadd[180979]: group added to /etc/gshadow: name=nova
Nov 24 14:23:58 compute-0 groupadd[180979]: new group: name=nova, GID=42436
Nov 24 14:23:58 compute-0 sudo[180976]: pam_unix(sudo:session): session closed for user root
Nov 24 14:23:59 compute-0 sudo[181144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pikskboosyalvcnhchmeolrpbwltouiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994238.6229806-1204-110047765743207/AnsiballZ_user.py'
Nov 24 14:23:59 compute-0 sudo[181144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:23:59 compute-0 podman[181108]: 2025-11-24 14:23:59.166498238 +0000 UTC m=+0.075394438 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 14:23:59 compute-0 python3.9[181150]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 14:23:59 compute-0 useradd[181154]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Nov 24 14:23:59 compute-0 useradd[181154]: add 'nova' to group 'libvirt'
Nov 24 14:23:59 compute-0 useradd[181154]: add 'nova' to shadow group 'libvirt'
Nov 24 14:23:59 compute-0 sudo[181144]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:00 compute-0 sshd-session[181185]: Accepted publickey for zuul from 192.168.122.30 port 34796 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:24:00 compute-0 systemd-logind[807]: New session 25 of user zuul.
Nov 24 14:24:00 compute-0 systemd[1]: Started Session 25 of User zuul.
Nov 24 14:24:00 compute-0 sshd-session[181185]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:24:00 compute-0 sshd-session[181188]: Received disconnect from 192.168.122.30 port 34796:11: disconnected by user
Nov 24 14:24:00 compute-0 sshd-session[181188]: Disconnected from user zuul 192.168.122.30 port 34796
Nov 24 14:24:00 compute-0 sshd-session[181185]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:24:00 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Nov 24 14:24:00 compute-0 systemd-logind[807]: Session 25 logged out. Waiting for processes to exit.
Nov 24 14:24:00 compute-0 systemd-logind[807]: Removed session 25.
Nov 24 14:24:01 compute-0 python3.9[181338]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:01 compute-0 podman[181433]: 2025-11-24 14:24:01.59076265 +0000 UTC m=+0.068468379 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 14:24:01 compute-0 python3.9[181469]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994240.7192295-1229-95010853907811/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:02 compute-0 python3.9[181630]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:02 compute-0 python3.9[181706]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:03 compute-0 python3.9[181856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:03 compute-0 python3.9[181977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994243.032324-1229-272231775116952/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:04 compute-0 python3.9[182127]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:05 compute-0 python3.9[182248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994244.1106706-1229-109259273091972/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:05 compute-0 python3.9[182398]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:06 compute-0 python3.9[182519]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994245.1676946-1229-187376802810271/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:06 compute-0 python3.9[182669]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:07 compute-0 python3.9[182790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994246.3570278-1229-110745576629928/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:07 compute-0 sudo[182940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqsqefklvmyoirrtdjnotdbnubdhieez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994247.578416-1312-196227870515527/AnsiballZ_file.py'
Nov 24 14:24:07 compute-0 sudo[182940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:08 compute-0 python3.9[182942]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:24:08 compute-0 sudo[182940]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:08 compute-0 sudo[183092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijjqsstpiftmjtozmewhazayglbeskga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994248.2689729-1320-210401966867998/AnsiballZ_copy.py'
Nov 24 14:24:08 compute-0 sudo[183092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:08 compute-0 python3.9[183094]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:24:08 compute-0 sudo[183092]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:09 compute-0 sudo[183244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuhvhoksmuwzsxcdxkmmwdnayiudfvqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994248.8763006-1328-79312646609108/AnsiballZ_stat.py'
Nov 24 14:24:09 compute-0 sudo[183244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:09 compute-0 python3.9[183246]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:24:09 compute-0 sudo[183244]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:09 compute-0 sudo[183404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykjudgvbyxlixmsfssjelumumgppgoig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994249.5677567-1336-113396041530829/AnsiballZ_stat.py'
Nov 24 14:24:09 compute-0 sudo[183404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:09 compute-0 podman[183370]: 2025-11-24 14:24:09.930426274 +0000 UTC m=+0.109118174 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:24:10 compute-0 python3.9[183409]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:10 compute-0 sudo[183404]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:10 compute-0 sudo[183543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jffqgdpylsofsquzxmemgqhieyomdliz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994249.5677567-1336-113396041530829/AnsiballZ_copy.py'
Nov 24 14:24:10 compute-0 sudo[183543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:10 compute-0 python3.9[183545]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1763994249.5677567-1336-113396041530829/.source _original_basename=.t7vti3b1 follow=False checksum=dbea00efeb760a4ce71ea6e45abfa6ffc102dade backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 24 14:24:10 compute-0 sudo[183543]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:11 compute-0 python3.9[183697]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:24:12 compute-0 python3.9[183849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:12 compute-0 python3.9[183970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994251.7044384-1362-152092657997668/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=4c77b2c041a7564aa2c84115117dc8517e9bb9ef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:13 compute-0 python3.9[184120]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:13 compute-0 python3.9[184241]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994252.8804517-1377-164666534485610/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=941d5739094d046b86479403aeaaf0441b82ba11 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:14 compute-0 sudo[184391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sobaiepgywwptxfradcgqhthaitxenqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994254.2145462-1394-118919718829455/AnsiballZ_container_config_data.py'
Nov 24 14:24:14 compute-0 sudo[184391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:14 compute-0 python3.9[184393]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 24 14:24:14 compute-0 sudo[184391]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:15 compute-0 sudo[184543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmifgukvdtbxqryjhuueahydiavcyfqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994254.8128934-1403-61805302370190/AnsiballZ_container_config_hash.py'
Nov 24 14:24:15 compute-0 sudo[184543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:15 compute-0 python3.9[184545]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 14:24:15 compute-0 sudo[184543]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:15 compute-0 sudo[184695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzhcgjkjvejlewehvfyglqriukzrnqbp ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763994255.5617237-1413-113968082695176/AnsiballZ_edpm_container_manage.py'
Nov 24 14:24:15 compute-0 sudo[184695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:16 compute-0 python3[184697]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 14:24:16 compute-0 podman[184733]: 2025-11-24 14:24:16.323084374 +0000 UTC m=+0.044739626 container create 6910651536e7abafe3f733db63f88c0a916bb022a1ceab041be50540b6f83897 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:24:16 compute-0 podman[184733]: 2025-11-24 14:24:16.29974603 +0000 UTC m=+0.021401302 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076
Nov 24 14:24:16 compute-0 python3[184697]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 24 14:24:16 compute-0 sudo[184695]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:16 compute-0 sudo[184921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-locqgqwpdhrrnzuyiaqtzxaaozhtpoxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994256.6529274-1421-244288816307377/AnsiballZ_stat.py'
Nov 24 14:24:16 compute-0 sudo[184921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:17 compute-0 python3.9[184923]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:24:17 compute-0 sudo[184921]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:17 compute-0 sudo[185075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snjcnlclmkgrwkwlneconzlxjlgbldvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994257.576605-1433-131741796354531/AnsiballZ_container_config_data.py'
Nov 24 14:24:17 compute-0 sudo[185075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:18 compute-0 python3.9[185077]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 24 14:24:18 compute-0 sudo[185075]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:18 compute-0 sudo[185227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozdelnleiznuzxgpzlcalysfyrvielwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994258.3421655-1442-186820619030993/AnsiballZ_container_config_hash.py'
Nov 24 14:24:18 compute-0 sudo[185227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:18 compute-0 python3.9[185229]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 14:24:18 compute-0 sudo[185227]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:19 compute-0 sudo[185379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxdgjgevlaprykktrlyvdwahvxtvxluy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763994259.1324432-1452-205674025045093/AnsiballZ_edpm_container_manage.py'
Nov 24 14:24:19 compute-0 sudo[185379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:19 compute-0 python3[185381]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 14:24:19 compute-0 podman[185417]: 2025-11-24 14:24:19.842338876 +0000 UTC m=+0.039882933 container create c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:24:19 compute-0 podman[185417]: 2025-11-24 14:24:19.821884631 +0000 UTC m=+0.019428708 image pull 8e31b7b83c8d26bacd9598fdae1b287d27f8fa7d1d3cf4270dd8e435ff2f6a66 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076
Nov 24 14:24:19 compute-0 python3[185381]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076 kolla_start
Nov 24 14:24:19 compute-0 sudo[185379]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:20 compute-0 sudo[185605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izznqhzfzrkqcqgeatsdlwvzothicykv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994260.1463447-1460-264277850638373/AnsiballZ_stat.py'
Nov 24 14:24:20 compute-0 sudo[185605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:20 compute-0 python3.9[185607]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:24:20 compute-0 sudo[185605]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:21 compute-0 sudo[185759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyvpuupdsjewocwxfbuwhizjptokszaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994261.0442033-1469-80643099079164/AnsiballZ_file.py'
Nov 24 14:24:21 compute-0 sudo[185759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:21 compute-0 python3.9[185761]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:24:21 compute-0 sudo[185759]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:22 compute-0 sudo[185910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujwiwdbpngithwhqoexcbrhupufkpekg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994261.6257336-1469-75720943647765/AnsiballZ_copy.py'
Nov 24 14:24:22 compute-0 sudo[185910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:22 compute-0 python3.9[185912]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763994261.6257336-1469-75720943647765/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:24:22 compute-0 sudo[185910]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:22 compute-0 sudo[185986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onjbizptxatquksdvbdzrqqquulgqjjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994261.6257336-1469-75720943647765/AnsiballZ_systemd.py'
Nov 24 14:24:22 compute-0 sudo[185986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:22 compute-0 python3.9[185988]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:24:22 compute-0 systemd[1]: Reloading.
Nov 24 14:24:22 compute-0 systemd-rc-local-generator[186017]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:24:23 compute-0 systemd-sysv-generator[186022]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:24:23 compute-0 sudo[185986]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:23 compute-0 sudo[186097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wynmpbzosczdtktfaqiowgsmkalrzpkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994261.6257336-1469-75720943647765/AnsiballZ_systemd.py'
Nov 24 14:24:23 compute-0 sudo[186097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:23 compute-0 python3.9[186099]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:24:23 compute-0 systemd[1]: Reloading.
Nov 24 14:24:23 compute-0 systemd-rc-local-generator[186124]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:24:23 compute-0 systemd-sysv-generator[186131]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:24:24 compute-0 systemd[1]: Starting nova_compute container...
Nov 24 14:24:24 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:24:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:24 compute-0 podman[186139]: 2025-11-24 14:24:24.188252865 +0000 UTC m=+0.115652121 container init c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 14:24:24 compute-0 podman[186139]: 2025-11-24 14:24:24.194117144 +0000 UTC m=+0.121516390 container start c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:24:24 compute-0 podman[186139]: nova_compute
Nov 24 14:24:24 compute-0 nova_compute[186155]: + sudo -E kolla_set_configs
Nov 24 14:24:24 compute-0 systemd[1]: Started nova_compute container.
Nov 24 14:24:24 compute-0 sudo[186097]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Validating config file
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Copying service configuration files
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Deleting /etc/ceph
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Creating directory /etc/ceph
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /etc/ceph
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Writing out command to execute
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 14:24:24 compute-0 nova_compute[186155]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 14:24:24 compute-0 nova_compute[186155]: ++ cat /run_command
Nov 24 14:24:24 compute-0 nova_compute[186155]: + CMD=nova-compute
Nov 24 14:24:24 compute-0 nova_compute[186155]: + ARGS=
Nov 24 14:24:24 compute-0 nova_compute[186155]: + sudo kolla_copy_cacerts
Nov 24 14:24:24 compute-0 nova_compute[186155]: + [[ ! -n '' ]]
Nov 24 14:24:24 compute-0 nova_compute[186155]: + . kolla_extend_start
Nov 24 14:24:24 compute-0 nova_compute[186155]: + echo 'Running command: '\''nova-compute'\'''
Nov 24 14:24:24 compute-0 nova_compute[186155]: Running command: 'nova-compute'
Nov 24 14:24:24 compute-0 nova_compute[186155]: + umask 0022
Nov 24 14:24:24 compute-0 nova_compute[186155]: + exec nova-compute
Nov 24 14:24:25 compute-0 python3.9[186317]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:24:25 compute-0 python3.9[186467]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:24:26 compute-0 nova_compute[186155]: 2025-11-24 14:24:26.210 186159 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 14:24:26 compute-0 nova_compute[186155]: 2025-11-24 14:24:26.211 186159 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 14:24:26 compute-0 nova_compute[186155]: 2025-11-24 14:24:26.211 186159 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 14:24:26 compute-0 nova_compute[186155]: 2025-11-24 14:24:26.211 186159 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 24 14:24:26 compute-0 nova_compute[186155]: 2025-11-24 14:24:26.337 186159 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:24:26 compute-0 nova_compute[186155]: 2025-11-24 14:24:26.362 186159 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:24:26 compute-0 nova_compute[186155]: 2025-11-24 14:24:26.362 186159 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 24 14:24:26 compute-0 python3.9[186621]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:24:26 compute-0 nova_compute[186155]: 2025-11-24 14:24:26.970 186159 INFO nova.virt.driver [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.098 186159 INFO nova.compute.provider_config [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.111 186159 DEBUG oslo_concurrency.lockutils [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.112 186159 DEBUG oslo_concurrency.lockutils [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.112 186159 DEBUG oslo_concurrency.lockutils [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.112 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.112 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.113 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.113 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.113 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.113 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.113 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.113 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.113 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.114 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.114 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.114 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.114 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.114 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.114 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.114 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.115 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.115 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.115 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.115 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.115 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.115 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.115 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.116 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.116 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.116 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.116 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.116 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.116 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.117 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.117 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.117 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.117 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.117 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.117 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.117 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.118 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.118 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.118 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.118 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.118 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.118 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.119 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.119 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.119 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.119 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.119 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.119 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.120 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.120 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.120 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.120 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.120 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.120 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.120 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.121 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.121 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.121 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.121 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.121 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.121 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.121 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.122 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.122 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.122 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.122 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.122 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.122 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.122 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.123 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.123 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.123 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.123 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.123 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.123 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.123 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.124 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.124 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.124 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.124 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.124 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.124 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.125 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.125 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.125 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.125 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.125 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.125 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.125 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.126 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.126 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.126 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.126 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.126 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.126 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.127 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.127 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.127 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.127 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.127 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.127 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.128 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.128 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.128 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.128 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.128 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.128 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.129 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.129 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.129 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.129 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.129 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.129 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.129 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.130 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.130 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.130 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.130 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.130 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.130 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.130 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.131 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.131 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.131 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.131 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.131 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.131 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.132 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.132 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.132 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.132 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.132 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.132 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.133 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.133 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.133 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.133 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.133 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.133 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.134 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.134 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.134 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.134 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.134 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.134 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.135 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.135 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.135 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.135 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.135 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.136 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.136 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.136 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.136 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.136 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.137 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.137 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.137 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.137 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.137 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.138 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.138 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.138 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.138 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.138 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.139 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.139 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.139 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.139 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.140 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.140 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.140 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.140 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.140 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.141 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.141 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.141 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.141 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.141 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.142 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.142 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.142 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.142 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.142 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.142 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.143 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.143 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.143 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.143 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.143 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.143 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.143 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.144 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.144 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.144 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.144 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.144 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.144 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.145 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.145 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.145 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.145 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.145 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.145 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.145 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.146 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.146 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.146 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.146 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.146 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.146 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.146 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.147 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.147 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.147 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.147 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.147 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.147 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.147 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.148 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.148 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.148 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.148 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.148 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.149 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.149 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.149 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.149 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.149 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.149 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.149 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.150 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.150 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.150 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.150 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.150 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.150 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.151 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.151 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.151 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.151 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.151 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.151 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.152 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.152 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.152 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.152 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.152 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.152 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.153 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.153 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.153 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.153 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.153 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.153 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.153 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.154 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.154 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.154 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.154 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.154 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.154 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.155 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.155 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.155 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.155 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.155 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.155 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.155 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.156 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.156 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.156 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.156 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.156 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.156 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.156 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.157 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.157 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.157 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.157 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.157 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.157 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.158 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.158 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.158 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.158 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.158 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.158 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.158 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.159 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.159 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.159 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.159 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.159 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.159 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.160 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.160 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.160 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.160 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.160 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.160 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.161 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.161 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.161 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.161 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.161 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.161 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.162 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.162 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.162 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.162 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.162 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.162 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.162 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.163 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.163 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.163 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.163 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.163 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.163 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.164 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.164 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.164 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.164 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.164 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.165 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.165 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.165 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.165 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.165 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.166 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.166 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.166 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.166 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.166 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.166 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.167 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.167 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.167 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.167 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.167 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.167 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.167 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.168 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.168 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.168 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.168 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.168 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.168 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.169 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.169 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.169 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.169 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.169 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.169 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.170 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.170 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.170 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.170 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.170 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.170 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.170 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.171 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.171 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.171 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.171 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.171 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.171 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.171 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.172 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.172 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.172 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.172 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.172 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.172 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.172 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.173 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.173 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.173 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.173 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.173 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.173 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.173 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.174 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.174 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.174 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.174 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.174 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.174 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.175 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.175 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.175 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.175 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.175 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.176 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.176 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.176 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.176 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.176 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.176 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.177 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.177 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.177 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.177 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.177 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.177 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.177 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.178 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.178 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.178 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.178 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.178 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.178 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.178 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.179 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.179 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.179 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.179 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.179 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.179 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.179 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.180 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.180 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.180 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.180 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.180 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.180 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.181 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.181 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.181 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.181 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.181 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.181 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.182 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.182 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.182 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.182 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.182 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.182 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.182 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.183 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.183 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.183 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.183 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.183 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.183 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.183 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.184 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.184 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.184 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.184 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.184 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.185 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.185 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.185 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.185 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.185 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.185 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.185 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.186 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.186 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.186 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.186 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.186 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.186 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.186 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.187 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.187 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.187 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.187 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.187 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.187 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.187 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.188 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.188 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.188 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.188 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.188 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.188 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.189 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.189 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.189 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.189 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.189 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.189 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.189 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.190 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.190 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.190 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.190 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.190 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.190 186159 WARNING oslo_config.cfg [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 24 14:24:27 compute-0 nova_compute[186155]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 24 14:24:27 compute-0 nova_compute[186155]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 24 14:24:27 compute-0 nova_compute[186155]: and ``live_migration_inbound_addr`` respectively.
Nov 24 14:24:27 compute-0 nova_compute[186155]: ).  Its value may be silently ignored in the future.
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.191 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.191 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.191 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.191 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.191 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.191 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.192 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.192 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.192 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.192 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.192 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.192 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.193 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.193 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.193 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.193 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.193 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.193 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.193 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.194 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.194 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.194 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.194 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.194 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.194 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.194 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.195 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.195 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.195 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.195 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.195 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.195 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.196 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.196 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.196 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.196 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.196 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.196 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.196 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.197 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.197 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.197 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.197 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.197 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.197 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.198 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.198 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.198 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.198 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.198 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.198 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.198 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.199 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.199 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.199 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.199 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.199 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.199 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.200 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.200 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.200 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.200 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.200 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.200 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.200 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.201 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.201 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.201 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.201 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.201 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.201 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.201 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.202 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.202 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.202 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.202 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.202 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.202 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.202 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.203 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.203 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.203 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.203 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.203 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.203 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.203 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.204 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.204 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.204 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.204 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.204 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.204 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.205 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.205 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.205 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.205 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.205 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.205 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.205 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.206 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.206 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.206 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.206 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.206 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.206 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.206 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.207 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.207 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.207 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.207 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.207 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.207 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.208 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.208 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.208 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.208 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.208 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.209 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.209 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.209 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.209 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.209 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.209 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.210 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.210 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.210 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.210 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.210 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.211 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.211 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.211 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.211 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.211 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.211 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.211 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.212 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.212 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.212 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.212 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.212 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.212 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.213 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.213 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.213 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.213 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.213 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.213 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.213 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.214 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.214 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.214 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.214 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.214 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.214 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.214 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.215 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.215 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.215 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.215 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.215 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.215 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.216 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.216 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.216 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.216 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.216 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.216 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.216 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.217 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.217 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.217 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.217 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.217 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.217 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.217 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.218 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.218 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.218 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.218 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.218 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.218 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.218 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.219 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.219 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.219 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.219 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.219 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.219 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.220 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.220 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.220 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.220 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.220 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.220 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.221 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.221 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.221 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.221 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.221 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.222 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.222 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.222 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.222 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.222 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.222 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.222 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.223 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.223 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.223 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.223 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.223 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.223 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.223 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.224 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.224 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.224 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.224 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.224 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.224 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.224 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.225 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.225 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.225 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.225 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.225 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.225 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.225 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.226 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.226 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.226 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.226 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.226 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.226 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.226 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.227 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.227 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.227 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.227 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.227 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.227 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.227 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.228 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.228 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.228 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.228 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.228 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.228 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.229 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.229 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.229 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.229 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.229 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.229 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.230 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.230 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.230 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.230 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.230 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.230 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.230 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.231 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.231 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.231 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.231 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.231 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.231 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.232 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.232 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.232 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.232 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.232 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.232 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.232 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.233 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.233 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.233 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.233 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.233 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.233 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.234 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.234 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.234 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.234 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.234 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.234 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.234 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.235 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.235 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.235 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.235 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.235 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.235 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.236 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.236 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.236 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.236 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.236 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.236 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.236 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.237 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.237 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.237 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.237 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.237 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.237 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.238 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.238 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.238 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.238 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.238 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.238 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.238 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.239 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.239 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.239 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.239 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.239 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.239 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.239 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.240 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.240 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.240 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.240 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.240 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.240 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.241 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.241 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.241 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.241 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.241 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.241 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.241 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.241 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.242 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.242 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.242 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.242 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.242 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.242 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.242 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.243 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.243 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.243 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.243 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.243 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.243 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.243 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.244 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.244 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.244 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.244 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.244 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.244 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.245 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.245 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.245 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.245 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.245 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.245 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.245 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.246 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.246 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.246 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.246 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.246 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.246 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.246 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.247 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.247 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.247 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.247 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.247 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.247 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.247 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.248 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.248 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.248 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.248 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.248 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.248 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.249 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.249 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.249 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.249 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.249 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.249 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.249 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.250 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.250 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.250 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.250 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.250 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.250 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.250 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.251 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.251 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.251 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.251 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.251 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.251 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.251 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.252 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.252 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.252 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.252 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.252 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.252 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.253 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.253 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.253 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.253 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.253 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.253 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.253 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.254 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.254 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.254 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.254 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.254 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.254 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.254 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.255 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.255 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.255 186159 DEBUG oslo_service.service [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.256 186159 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.266 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.266 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.267 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.267 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 24 14:24:27 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Nov 24 14:24:27 compute-0 systemd[1]: Started libvirt QEMU daemon.
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.332 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f92c28051f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.334 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f92c28051f0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.335 186159 INFO nova.virt.libvirt.driver [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Connection event '1' reason 'None'
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.356 186159 WARNING nova.virt.libvirt.driver [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 24 14:24:27 compute-0 nova_compute[186155]: 2025-11-24 14:24:27.356 186159 DEBUG nova.virt.libvirt.volume.mount [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 24 14:24:27 compute-0 sudo[186823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfuivsfitioeylqdusznmrbjyorlzyld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994266.9375286-1529-15586818819760/AnsiballZ_podman_container.py'
Nov 24 14:24:27 compute-0 sudo[186823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:27 compute-0 python3.9[186825]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 24 14:24:27 compute-0 sudo[186823]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:27 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 14:24:27 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 14:24:28 compute-0 sshd[128788]: Timeout before authentication for connection from 71.32.48.202 to 38.102.83.214, pid = 163185
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.134 186159 INFO nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Libvirt host capabilities <capabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]: 
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <host>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <uuid>29821a9d-05ed-4e9e-b48f-8dca86832284</uuid>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <arch>x86_64</arch>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model>EPYC-Rome-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <vendor>AMD</vendor>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <microcode version='16777317'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <signature family='23' model='49' stepping='0'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='x2apic'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='tsc-deadline'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='osxsave'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='hypervisor'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='tsc_adjust'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='spec-ctrl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='stibp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='arch-capabilities'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='cmp_legacy'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='topoext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='virt-ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='lbrv'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='tsc-scale'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='vmcb-clean'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='pause-filter'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='pfthreshold'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='svme-addr-chk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='rdctl-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='skip-l1dfl-vmentry'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='mds-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature name='pschange-mc-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <pages unit='KiB' size='4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <pages unit='KiB' size='2048'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <pages unit='KiB' size='1048576'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <power_management>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <suspend_mem/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <suspend_disk/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <suspend_hybrid/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </power_management>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <iommu support='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <migration_features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <live/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <uri_transports>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <uri_transport>tcp</uri_transport>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <uri_transport>rdma</uri_transport>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </uri_transports>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </migration_features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <topology>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <cells num='1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <cell id='0'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:           <memory unit='KiB'>7864324</memory>
Nov 24 14:24:28 compute-0 nova_compute[186155]:           <pages unit='KiB' size='4'>1966081</pages>
Nov 24 14:24:28 compute-0 nova_compute[186155]:           <pages unit='KiB' size='2048'>0</pages>
Nov 24 14:24:28 compute-0 nova_compute[186155]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 24 14:24:28 compute-0 nova_compute[186155]:           <distances>
Nov 24 14:24:28 compute-0 nova_compute[186155]:             <sibling id='0' value='10'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:           </distances>
Nov 24 14:24:28 compute-0 nova_compute[186155]:           <cpus num='8'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:           </cpus>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         </cell>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </cells>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </topology>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <cache>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </cache>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <secmodel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model>selinux</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <doi>0</doi>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </secmodel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <secmodel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model>dac</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <doi>0</doi>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </secmodel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </host>
Nov 24 14:24:28 compute-0 nova_compute[186155]: 
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <guest>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <os_type>hvm</os_type>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <arch name='i686'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <wordsize>32</wordsize>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <domain type='qemu'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <domain type='kvm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </arch>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <pae/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <nonpae/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <acpi default='on' toggle='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <apic default='on' toggle='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <cpuselection/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <deviceboot/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <disksnapshot default='on' toggle='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <externalSnapshot/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </guest>
Nov 24 14:24:28 compute-0 nova_compute[186155]: 
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <guest>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <os_type>hvm</os_type>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <arch name='x86_64'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <wordsize>64</wordsize>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <domain type='qemu'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <domain type='kvm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </arch>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <acpi default='on' toggle='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <apic default='on' toggle='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <cpuselection/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <deviceboot/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <disksnapshot default='on' toggle='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <externalSnapshot/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </guest>
Nov 24 14:24:28 compute-0 nova_compute[186155]: 
Nov 24 14:24:28 compute-0 nova_compute[186155]: </capabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]: 
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.141 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.159 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 24 14:24:28 compute-0 nova_compute[186155]: <domainCapabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <domain>kvm</domain>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <arch>i686</arch>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <vcpu max='4096'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <iothreads supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <os supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <enum name='firmware'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <loader supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>rom</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pflash</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='readonly'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>yes</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>no</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='secure'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>no</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </loader>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </os>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='host-passthrough' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='hostPassthroughMigratable'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>on</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>off</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='maximum' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='maximumMigratable'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>on</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>off</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='host-model' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <vendor>AMD</vendor>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='x2apic'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='hypervisor'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='stibp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='overflow-recov'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='succor'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='lbrv'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc-scale'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='flushbyasid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='pause-filter'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='pfthreshold'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='disable' name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='custom' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Dhyana-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Genoa'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='auto-ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='auto-ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-128'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-256'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-512'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v6'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v7'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='KnightsMill'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512er'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512pf'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='KnightsMill-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512er'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512pf'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G4-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tbm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G5-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tbm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SierraForest'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cmpccxadd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SierraForest-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cmpccxadd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='athlon'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='athlon-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='core2duo'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='core2duo-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='coreduo'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='coreduo-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='n270'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='n270-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='phenom'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='phenom-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <memoryBacking supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <enum name='sourceType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>file</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>anonymous</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>memfd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </memoryBacking>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <devices>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <disk supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='diskDevice'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>disk</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>cdrom</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>floppy</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>lun</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='bus'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>fdc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>scsi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>sata</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-non-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </disk>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <graphics supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vnc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>egl-headless</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dbus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </graphics>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <video supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='modelType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vga</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>cirrus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>none</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>bochs</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ramfb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </video>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <hostdev supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='mode'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>subsystem</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='startupPolicy'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>default</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>mandatory</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>requisite</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>optional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='subsysType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pci</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>scsi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='capsType'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='pciBackend'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </hostdev>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <rng supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-non-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>random</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>egd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>builtin</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </rng>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <filesystem supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='driverType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>path</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>handle</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtiofs</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </filesystem>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <tpm supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tpm-tis</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tpm-crb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>emulator</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>external</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendVersion'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>2.0</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </tpm>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <redirdev supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='bus'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </redirdev>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <channel supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pty</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>unix</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </channel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <crypto supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>qemu</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>builtin</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </crypto>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <interface supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>default</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>passt</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </interface>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <panic supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>isa</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>hyperv</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </panic>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <console supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>null</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pty</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dev</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>file</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pipe</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>stdio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>udp</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tcp</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>unix</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>qemu-vdagent</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dbus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </console>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </devices>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <gic supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <vmcoreinfo supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <genid supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <backingStoreInput supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <backup supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <async-teardown supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <ps2 supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <sev supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <sgx supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <hyperv supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='features'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>relaxed</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vapic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>spinlocks</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vpindex</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>runtime</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>synic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>stimer</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>reset</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vendor_id</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>frequencies</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>reenlightenment</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tlbflush</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ipi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>avic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>emsr_bitmap</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>xmm_input</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <defaults>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <spinlocks>4095</spinlocks>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <stimer_direct>on</stimer_direct>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </defaults>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </hyperv>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <launchSecurity supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='sectype'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tdx</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </launchSecurity>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </features>
Nov 24 14:24:28 compute-0 nova_compute[186155]: </domainCapabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.164 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 24 14:24:28 compute-0 nova_compute[186155]: <domainCapabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <domain>kvm</domain>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <arch>i686</arch>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <vcpu max='240'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <iothreads supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <os supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <enum name='firmware'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <loader supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>rom</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pflash</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='readonly'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>yes</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>no</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='secure'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>no</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </loader>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </os>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='host-passthrough' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='hostPassthroughMigratable'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>on</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>off</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='maximum' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='maximumMigratable'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>on</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>off</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='host-model' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <vendor>AMD</vendor>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='x2apic'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='hypervisor'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='stibp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='overflow-recov'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='succor'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='lbrv'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc-scale'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='flushbyasid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='pause-filter'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='pfthreshold'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='disable' name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='custom' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Dhyana-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Genoa'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='auto-ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='auto-ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-128'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-256'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-512'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v6'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v7'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='KnightsMill'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512er'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512pf'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='KnightsMill-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512er'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512pf'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G4-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tbm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G5-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tbm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SierraForest'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cmpccxadd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SierraForest-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cmpccxadd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='athlon'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='athlon-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='core2duo'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='core2duo-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='coreduo'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='coreduo-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='n270'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='n270-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='phenom'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='phenom-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <memoryBacking supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <enum name='sourceType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>file</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>anonymous</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>memfd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </memoryBacking>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <devices>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <disk supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='diskDevice'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>disk</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>cdrom</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>floppy</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>lun</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='bus'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ide</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>fdc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>scsi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>sata</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-non-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </disk>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <graphics supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vnc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>egl-headless</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dbus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </graphics>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <video supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='modelType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vga</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>cirrus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>none</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>bochs</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ramfb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </video>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <hostdev supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='mode'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>subsystem</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='startupPolicy'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>default</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>mandatory</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>requisite</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>optional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='subsysType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pci</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>scsi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='capsType'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='pciBackend'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </hostdev>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <rng supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-non-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>random</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>egd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>builtin</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </rng>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <filesystem supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='driverType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>path</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>handle</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtiofs</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </filesystem>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <tpm supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tpm-tis</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tpm-crb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>emulator</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>external</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendVersion'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>2.0</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </tpm>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <redirdev supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='bus'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </redirdev>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <channel supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pty</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>unix</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </channel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <crypto supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>qemu</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>builtin</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </crypto>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <interface supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>default</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>passt</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </interface>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <panic supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>isa</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>hyperv</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </panic>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <console supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>null</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pty</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dev</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>file</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pipe</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>stdio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>udp</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tcp</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>unix</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>qemu-vdagent</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dbus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </console>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </devices>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <gic supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <vmcoreinfo supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <genid supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <backingStoreInput supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <backup supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <async-teardown supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <ps2 supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <sev supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <sgx supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <hyperv supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='features'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>relaxed</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vapic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>spinlocks</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vpindex</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>runtime</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>synic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>stimer</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>reset</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vendor_id</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>frequencies</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>reenlightenment</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tlbflush</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ipi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>avic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>emsr_bitmap</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>xmm_input</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <defaults>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <spinlocks>4095</spinlocks>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <stimer_direct>on</stimer_direct>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </defaults>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </hyperv>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <launchSecurity supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='sectype'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tdx</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </launchSecurity>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </features>
Nov 24 14:24:28 compute-0 nova_compute[186155]: </domainCapabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.199 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.202 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 24 14:24:28 compute-0 nova_compute[186155]: <domainCapabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <domain>kvm</domain>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <arch>x86_64</arch>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <vcpu max='240'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <iothreads supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <os supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <enum name='firmware'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <loader supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>rom</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pflash</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='readonly'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>yes</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>no</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='secure'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>no</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </loader>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </os>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='host-passthrough' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='hostPassthroughMigratable'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>on</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>off</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='maximum' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='maximumMigratable'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>on</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>off</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='host-model' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <vendor>AMD</vendor>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='x2apic'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='hypervisor'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='stibp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='overflow-recov'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='succor'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='lbrv'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc-scale'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='flushbyasid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='pause-filter'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='pfthreshold'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='disable' name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='custom' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Dhyana-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Genoa'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='auto-ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='auto-ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-128'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-256'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-512'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v6'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v7'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='KnightsMill'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512er'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512pf'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='KnightsMill-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512er'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512pf'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G4-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tbm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G5-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tbm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SierraForest'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cmpccxadd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SierraForest-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cmpccxadd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='athlon'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='athlon-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='core2duo'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='core2duo-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='coreduo'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='coreduo-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='n270'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='n270-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='phenom'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='phenom-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <memoryBacking supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <enum name='sourceType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>file</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>anonymous</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>memfd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </memoryBacking>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <devices>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <disk supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='diskDevice'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>disk</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>cdrom</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>floppy</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>lun</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='bus'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ide</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>fdc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>scsi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>sata</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-non-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </disk>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <graphics supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vnc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>egl-headless</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dbus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </graphics>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <video supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='modelType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vga</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>cirrus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>none</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>bochs</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ramfb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </video>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <hostdev supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='mode'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>subsystem</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='startupPolicy'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>default</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>mandatory</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>requisite</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>optional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='subsysType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pci</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>scsi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='capsType'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='pciBackend'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </hostdev>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <rng supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-non-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>random</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>egd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>builtin</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </rng>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <filesystem supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='driverType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>path</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>handle</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtiofs</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </filesystem>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <tpm supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tpm-tis</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tpm-crb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>emulator</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>external</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendVersion'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>2.0</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </tpm>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <redirdev supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='bus'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </redirdev>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <channel supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pty</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>unix</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </channel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <crypto supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>qemu</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>builtin</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </crypto>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <interface supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>default</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>passt</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </interface>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <panic supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>isa</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>hyperv</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </panic>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <console supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>null</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pty</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dev</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>file</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pipe</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>stdio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>udp</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tcp</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>unix</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>qemu-vdagent</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dbus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </console>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </devices>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <gic supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <vmcoreinfo supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <genid supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <backingStoreInput supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <backup supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <async-teardown supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <ps2 supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <sev supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <sgx supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <hyperv supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='features'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>relaxed</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vapic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>spinlocks</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vpindex</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>runtime</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>synic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>stimer</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>reset</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vendor_id</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>frequencies</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>reenlightenment</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tlbflush</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ipi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>avic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>emsr_bitmap</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>xmm_input</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <defaults>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <spinlocks>4095</spinlocks>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <stimer_direct>on</stimer_direct>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </defaults>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </hyperv>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <launchSecurity supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='sectype'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tdx</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </launchSecurity>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </features>
Nov 24 14:24:28 compute-0 nova_compute[186155]: </domainCapabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.261 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 24 14:24:28 compute-0 nova_compute[186155]: <domainCapabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <domain>kvm</domain>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <arch>x86_64</arch>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <vcpu max='4096'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <iothreads supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <os supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <enum name='firmware'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>efi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <loader supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>rom</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pflash</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='readonly'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>yes</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>no</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='secure'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>yes</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>no</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </loader>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </os>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='host-passthrough' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='hostPassthroughMigratable'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>on</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>off</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='maximum' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='maximumMigratable'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>on</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>off</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='host-model' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <vendor>AMD</vendor>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='x2apic'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='hypervisor'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='stibp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='overflow-recov'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='succor'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='lbrv'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='tsc-scale'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='flushbyasid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='pause-filter'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='pfthreshold'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <feature policy='disable' name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <mode name='custom' supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Broadwell-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Cooperlake-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Denverton-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Dhyana-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Genoa'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='auto-ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='auto-ibrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Milan-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amd-psfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='stibp-always-on'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-Rome-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='EPYC-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='GraniteRapids-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-128'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-256'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx10-512'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='prefetchiti'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Haswell-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v6'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Icelake-Server-v7'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='IvyBridge-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='KnightsMill'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512er'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512pf'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='KnightsMill-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512er'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512pf'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G4-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tbm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Opteron_G5-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fma4'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tbm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xop'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SapphireRapids-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='amx-tile'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-bf16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-fp16'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bitalg'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrc'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fzrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='la57'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='taa-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xfd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SierraForest'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cmpccxadd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='SierraForest-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ifma'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cmpccxadd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fbsdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='fsrs'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ibrs-all'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mcdt-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pbrsb-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='psdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='serialize'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vaes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Client-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='hle'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='rtm'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Skylake-Server-v5'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512bw'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512cd'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512dq'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512f'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='avx512vl'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='invpcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pcid'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='pku'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='mpx'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v2'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v3'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='core-capability'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='split-lock-detect'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='Snowridge-v4'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='cldemote'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='erms'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='gfni'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdir64b'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='movdiri'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='xsaves'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='athlon'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='athlon-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='core2duo'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='core2duo-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='coreduo'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='coreduo-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='n270'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='n270-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='ss'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='phenom'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <blockers model='phenom-v1'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnow'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <feature name='3dnowext'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </blockers>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </mode>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </cpu>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <memoryBacking supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <enum name='sourceType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>file</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>anonymous</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <value>memfd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </memoryBacking>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <devices>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <disk supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='diskDevice'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>disk</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>cdrom</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>floppy</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>lun</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='bus'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>fdc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>scsi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>sata</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-non-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </disk>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <graphics supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vnc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>egl-headless</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dbus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </graphics>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <video supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='modelType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vga</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>cirrus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>none</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>bochs</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ramfb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </video>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <hostdev supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='mode'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>subsystem</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='startupPolicy'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>default</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>mandatory</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>requisite</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>optional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='subsysType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pci</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>scsi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='capsType'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='pciBackend'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </hostdev>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <rng supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtio-non-transitional</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>random</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>egd</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>builtin</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </rng>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <filesystem supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='driverType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>path</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>handle</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>virtiofs</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </filesystem>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <tpm supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tpm-tis</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tpm-crb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>emulator</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>external</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendVersion'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>2.0</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </tpm>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <redirdev supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='bus'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>usb</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </redirdev>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <channel supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pty</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>unix</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </channel>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <crypto supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>qemu</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendModel'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>builtin</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </crypto>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <interface supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='backendType'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>default</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>passt</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </interface>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <panic supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='model'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>isa</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>hyperv</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </panic>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <console supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='type'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>null</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vc</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pty</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dev</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>file</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>pipe</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>stdio</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>udp</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tcp</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>unix</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>qemu-vdagent</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>dbus</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </console>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </devices>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   <features>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <gic supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <vmcoreinfo supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <genid supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <backingStoreInput supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <backup supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <async-teardown supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <ps2 supported='yes'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <sev supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <sgx supported='no'/>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <hyperv supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='features'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>relaxed</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vapic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>spinlocks</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vpindex</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>runtime</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>synic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>stimer</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>reset</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>vendor_id</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>frequencies</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>reenlightenment</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tlbflush</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>ipi</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>avic</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>emsr_bitmap</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>xmm_input</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <defaults>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <spinlocks>4095</spinlocks>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <stimer_direct>on</stimer_direct>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </defaults>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </hyperv>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     <launchSecurity supported='yes'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       <enum name='sectype'>
Nov 24 14:24:28 compute-0 nova_compute[186155]:         <value>tdx</value>
Nov 24 14:24:28 compute-0 nova_compute[186155]:       </enum>
Nov 24 14:24:28 compute-0 nova_compute[186155]:     </launchSecurity>
Nov 24 14:24:28 compute-0 nova_compute[186155]:   </features>
Nov 24 14:24:28 compute-0 nova_compute[186155]: </domainCapabilities>
Nov 24 14:24:28 compute-0 nova_compute[186155]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.318 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.318 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.319 186159 DEBUG nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.319 186159 INFO nova.virt.libvirt.host [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Secure Boot support detected
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.320 186159 INFO nova.virt.libvirt.driver [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.320 186159 INFO nova.virt.libvirt.driver [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.329 186159 DEBUG nova.virt.libvirt.driver [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.392 186159 INFO nova.virt.node [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Determined node identity 08b6207d-b34e-43d6-b1a7-1741d75aa10b from /var/lib/nova/compute_id
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.416 186159 WARNING nova.compute.manager [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Compute nodes ['08b6207d-b34e-43d6-b1a7-1741d75aa10b'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 24 14:24:28 compute-0 sudo[187010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsdrnsoeiqzllztabaeovqbzlihbaaei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994268.0858066-1537-167783896753042/AnsiballZ_systemd.py'
Nov 24 14:24:28 compute-0 sudo[187010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.454 186159 INFO nova.compute.manager [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.485 186159 WARNING nova.compute.manager [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.485 186159 DEBUG oslo_concurrency.lockutils [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.485 186159 DEBUG oslo_concurrency.lockutils [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.486 186159 DEBUG oslo_concurrency.lockutils [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.486 186159 DEBUG nova.compute.resource_tracker [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:24:28 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Nov 24 14:24:28 compute-0 systemd[1]: Started libvirt nodedev daemon.
Nov 24 14:24:28 compute-0 python3.9[187012]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:24:28 compute-0 systemd[1]: Stopping nova_compute container...
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.815 186159 WARNING nova.virt.libvirt.driver [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.816 186159 DEBUG nova.compute.resource_tracker [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6165MB free_disk=73.66886138916016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.817 186159 DEBUG oslo_concurrency.lockutils [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.817 186159 DEBUG oslo_concurrency.lockutils [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.822 186159 DEBUG oslo_concurrency.lockutils [None req-fe1efb4a-72c1-47fb-b145-940ef5c9308c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.822 186159 DEBUG oslo_concurrency.lockutils [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.823 186159 DEBUG oslo_concurrency.lockutils [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:24:28 compute-0 nova_compute[186155]: 2025-11-24 14:24:28.823 186159 DEBUG oslo_concurrency.lockutils [None req-26db9857-a540-4cf8-9e3b-f4f84984d1b0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:24:29 compute-0 virtqemud[186719]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 24 14:24:29 compute-0 virtqemud[186719]: hostname: compute-0
Nov 24 14:24:29 compute-0 virtqemud[186719]: End of file while reading data: Input/output error
Nov 24 14:24:29 compute-0 systemd[1]: libpod-c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f.scope: Deactivated successfully.
Nov 24 14:24:29 compute-0 systemd[1]: libpod-c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f.scope: Consumed 3.045s CPU time.
Nov 24 14:24:29 compute-0 podman[187039]: 2025-11-24 14:24:29.209377095 +0000 UTC m=+0.433467700 container died c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 14:24:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f-userdata-shm.mount: Deactivated successfully.
Nov 24 14:24:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401-merged.mount: Deactivated successfully.
Nov 24 14:24:29 compute-0 podman[187039]: 2025-11-24 14:24:29.294149047 +0000 UTC m=+0.518239652 container cleanup c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:24:29 compute-0 podman[187039]: nova_compute
Nov 24 14:24:29 compute-0 podman[187055]: 2025-11-24 14:24:29.32260542 +0000 UTC m=+0.084165136 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:24:29 compute-0 podman[187088]: nova_compute
Nov 24 14:24:29 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 24 14:24:29 compute-0 systemd[1]: Stopped nova_compute container.
Nov 24 14:24:29 compute-0 systemd[1]: Starting nova_compute container...
Nov 24 14:24:29 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:24:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d6af88161ac5fd28e47cdd6aeb4421cad5a7069e278315ea97960e68e0c401/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:29 compute-0 podman[187102]: 2025-11-24 14:24:29.506990936 +0000 UTC m=+0.105103754 container init c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3)
Nov 24 14:24:29 compute-0 podman[187102]: 2025-11-24 14:24:29.518426306 +0000 UTC m=+0.116539114 container start c3c707f469f5588546bf955c16c3c2d5c3bc9665d71aca42046a5a27891f6e7f (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:24:29 compute-0 podman[187102]: nova_compute
Nov 24 14:24:29 compute-0 nova_compute[187118]: + sudo -E kolla_set_configs
Nov 24 14:24:29 compute-0 systemd[1]: Started nova_compute container.
Nov 24 14:24:29 compute-0 sudo[187010]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Validating config file
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Copying service configuration files
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Deleting /etc/ceph
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Creating directory /etc/ceph
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /etc/ceph
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Writing out command to execute
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 24 14:24:29 compute-0 nova_compute[187118]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 24 14:24:29 compute-0 nova_compute[187118]: ++ cat /run_command
Nov 24 14:24:29 compute-0 nova_compute[187118]: + CMD=nova-compute
Nov 24 14:24:29 compute-0 nova_compute[187118]: + ARGS=
Nov 24 14:24:29 compute-0 nova_compute[187118]: + sudo kolla_copy_cacerts
Nov 24 14:24:29 compute-0 nova_compute[187118]: + [[ ! -n '' ]]
Nov 24 14:24:29 compute-0 nova_compute[187118]: + . kolla_extend_start
Nov 24 14:24:29 compute-0 nova_compute[187118]: + echo 'Running command: '\''nova-compute'\'''
Nov 24 14:24:29 compute-0 nova_compute[187118]: Running command: 'nova-compute'
Nov 24 14:24:29 compute-0 nova_compute[187118]: + umask 0022
Nov 24 14:24:29 compute-0 nova_compute[187118]: + exec nova-compute
Nov 24 14:24:30 compute-0 sudo[187279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwxhdkuwwbpfklzxwkujcibjfjfxkbvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994269.8088791-1546-31839332737418/AnsiballZ_podman_container.py'
Nov 24 14:24:30 compute-0 sudo[187279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:30 compute-0 python3.9[187281]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 24 14:24:30 compute-0 systemd[1]: Started libpod-conmon-6910651536e7abafe3f733db63f88c0a916bb022a1ceab041be50540b6f83897.scope.
Nov 24 14:24:30 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:24:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3aa099b75883e9d4b19564f8c9f8ab9a5f2858af8d1c2359534857218eedeb28/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3aa099b75883e9d4b19564f8c9f8ab9a5f2858af8d1c2359534857218eedeb28/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3aa099b75883e9d4b19564f8c9f8ab9a5f2858af8d1c2359534857218eedeb28/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 24 14:24:30 compute-0 podman[187307]: 2025-11-24 14:24:30.733283692 +0000 UTC m=+0.124958513 container init 6910651536e7abafe3f733db63f88c0a916bb022a1ceab041be50540b6f83897 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute_init, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:24:30 compute-0 podman[187307]: 2025-11-24 14:24:30.74464015 +0000 UTC m=+0.136314941 container start 6910651536e7abafe3f733db63f88c0a916bb022a1ceab041be50540b6f83897 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 14:24:30 compute-0 python3.9[187281]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Applying nova statedir ownership
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 24 14:24:30 compute-0 nova_compute_init[187328]: INFO:nova_statedir:Nova statedir ownership complete
Nov 24 14:24:30 compute-0 systemd[1]: libpod-6910651536e7abafe3f733db63f88c0a916bb022a1ceab041be50540b6f83897.scope: Deactivated successfully.
Nov 24 14:24:30 compute-0 podman[187326]: 2025-11-24 14:24:30.8028314 +0000 UTC m=+0.033436409 container died 6910651536e7abafe3f733db63f88c0a916bb022a1ceab041be50540b6f83897 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute_init, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 14:24:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6910651536e7abafe3f733db63f88c0a916bb022a1ceab041be50540b6f83897-userdata-shm.mount: Deactivated successfully.
Nov 24 14:24:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-3aa099b75883e9d4b19564f8c9f8ab9a5f2858af8d1c2359534857218eedeb28-merged.mount: Deactivated successfully.
Nov 24 14:24:30 compute-0 podman[187341]: 2025-11-24 14:24:30.873052257 +0000 UTC m=+0.059758783 container cleanup 6910651536e7abafe3f733db63f88c0a916bb022a1ceab041be50540b6f83897 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076, name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, org.label-schema.build-date=20251118, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:b6e1e8a249d36ef36c6ac4170af1e043dda1ccc0f9672832d3ff151bf3533076', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Nov 24 14:24:30 compute-0 systemd[1]: libpod-conmon-6910651536e7abafe3f733db63f88c0a916bb022a1ceab041be50540b6f83897.scope: Deactivated successfully.
Nov 24 14:24:30 compute-0 sudo[187279]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:31 compute-0 sshd-session[159031]: Connection closed by 192.168.122.30 port 40430
Nov 24 14:24:31 compute-0 sshd-session[159028]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:24:31 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Nov 24 14:24:31 compute-0 systemd[1]: session-24.scope: Consumed 1min 49.226s CPU time.
Nov 24 14:24:31 compute-0 systemd-logind[807]: Session 24 logged out. Waiting for processes to exit.
Nov 24 14:24:31 compute-0 systemd-logind[807]: Removed session 24.
Nov 24 14:24:31 compute-0 nova_compute[187118]: 2025-11-24 14:24:31.620 187122 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 14:24:31 compute-0 nova_compute[187118]: 2025-11-24 14:24:31.620 187122 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 14:24:31 compute-0 nova_compute[187118]: 2025-11-24 14:24:31.621 187122 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Nov 24 14:24:31 compute-0 nova_compute[187118]: 2025-11-24 14:24:31.621 187122 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Nov 24 14:24:31 compute-0 nova_compute[187118]: 2025-11-24 14:24:31.757 187122 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:24:31 compute-0 nova_compute[187118]: 2025-11-24 14:24:31.785 187122 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:24:31 compute-0 nova_compute[187118]: 2025-11-24 14:24:31.786 187122 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.240 187122 INFO nova.virt.driver [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.374 187122 INFO nova.compute.provider_config [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.389 187122 DEBUG oslo_concurrency.lockutils [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.390 187122 DEBUG oslo_concurrency.lockutils [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.390 187122 DEBUG oslo_concurrency.lockutils [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.391 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.391 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.391 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.391 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.391 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.392 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.392 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.392 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.392 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.392 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.393 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.393 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.393 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.393 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.393 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.394 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.394 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.394 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.394 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.394 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.395 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.395 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.395 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.395 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.395 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.396 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.396 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.396 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.396 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.397 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.397 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.397 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.397 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.397 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.398 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.398 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.398 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.398 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.399 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.399 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.399 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.399 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.400 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.400 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.400 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.400 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.400 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.401 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.401 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.401 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.401 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.401 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.402 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.402 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.402 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.402 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.402 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.402 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.403 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.403 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.403 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.403 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.403 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.404 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.404 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.404 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.404 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.404 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.405 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.405 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.405 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.405 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.405 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.406 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.406 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.406 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.406 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.406 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.407 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.407 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.407 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.407 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.407 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.408 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.408 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.408 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.408 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.408 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.409 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.409 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.409 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.409 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.409 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.410 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.410 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.410 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.410 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.410 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.410 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.410 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.411 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.411 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.411 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.411 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.411 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.412 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.412 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.412 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.412 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.412 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.413 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.413 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.413 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.413 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.413 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.414 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.414 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.414 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.414 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.414 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.415 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.415 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.415 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.415 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.415 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.415 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.416 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.416 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.416 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.416 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.416 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.416 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.416 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.417 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.417 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.417 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.417 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.417 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.417 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.417 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.418 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.418 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.418 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.418 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.418 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.418 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.418 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.418 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.419 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.419 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.419 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.419 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.419 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.420 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.420 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.420 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.420 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.421 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.421 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.421 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.421 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.421 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.422 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.422 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.422 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.422 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.423 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.423 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.423 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.423 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.423 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.424 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.424 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.424 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.424 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.425 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.425 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.425 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.425 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.425 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.426 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.426 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.426 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.426 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.426 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.427 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.427 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.427 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.427 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.428 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.428 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.428 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.428 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.428 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.429 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.429 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.429 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.429 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.429 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.430 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.430 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.430 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.430 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.430 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.431 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.431 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.431 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.431 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.431 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.432 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.432 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.432 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.432 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.432 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.433 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.433 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.433 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.433 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.433 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.433 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.434 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.434 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.434 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.434 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.434 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.434 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.434 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.435 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.435 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.435 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.435 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.435 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.435 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.435 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.436 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.436 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.436 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.436 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.436 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.436 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.436 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.437 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.437 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.437 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.437 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.437 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.437 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.437 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.438 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.438 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.438 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.438 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.438 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.438 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.438 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.439 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.439 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.439 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.439 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.439 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.439 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.439 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.440 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.440 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.440 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.440 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.440 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.440 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.440 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.441 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.441 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.441 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.441 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.441 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.441 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.441 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.442 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.442 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.442 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.442 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.442 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.442 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.443 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.443 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.443 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.443 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.443 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.443 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.443 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.444 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.444 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.444 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.444 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.444 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.444 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.444 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.445 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.445 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.445 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.445 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.445 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.445 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.445 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.446 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.446 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.446 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.446 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.446 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.446 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.446 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.447 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.447 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.447 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.447 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.447 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.447 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.447 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.448 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.448 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.448 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.448 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.448 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.448 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.448 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.449 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.449 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.449 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.449 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.449 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.449 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.449 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.450 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.450 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.450 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.450 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.450 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.450 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.450 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.451 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.451 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.451 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.451 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.451 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.451 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.451 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.452 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.452 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.452 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.452 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.452 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.452 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.452 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.453 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.453 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.453 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.453 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.453 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.453 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.454 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.454 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.454 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.454 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.454 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.454 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.455 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.455 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.455 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.455 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.455 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.455 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.455 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.456 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.456 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.456 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.456 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.456 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.456 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.456 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.456 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.457 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.457 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.457 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.457 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.457 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.457 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.457 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.458 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.458 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.458 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.458 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.458 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.458 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.458 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.459 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.459 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.459 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.459 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.459 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.459 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.459 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.460 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.460 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.460 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.460 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.460 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.460 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.460 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.461 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.461 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.461 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.461 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.461 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.461 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.462 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.462 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.462 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.462 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.462 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.462 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.462 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.463 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.463 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.463 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.463 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.463 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.463 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.464 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.464 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.464 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.464 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.464 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.464 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.464 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.465 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.465 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.465 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.465 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.465 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.465 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.465 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.465 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.466 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.466 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.466 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.466 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.466 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.466 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.466 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.466 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.467 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.467 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.467 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.467 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.467 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.467 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.467 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.468 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.468 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.468 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.468 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.468 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.468 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.468 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.469 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.469 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.469 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.469 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.469 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.469 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.470 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.470 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.470 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.470 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.470 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.470 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.470 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.471 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.471 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.471 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.472 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.472 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.472 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.472 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.473 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.473 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.473 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.473 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.473 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.473 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.473 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.474 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.474 187122 WARNING oslo_config.cfg [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 24 14:24:32 compute-0 nova_compute[187118]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 24 14:24:32 compute-0 nova_compute[187118]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 24 14:24:32 compute-0 nova_compute[187118]: and ``live_migration_inbound_addr`` respectively.
Nov 24 14:24:32 compute-0 nova_compute[187118]: ).  Its value may be silently ignored in the future.
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.474 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 podman[187393]: 2025-11-24 14:24:32.474807977 +0000 UTC m=+0.080952769 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.475 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.475 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.475 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.475 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.476 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.476 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.476 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.476 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.476 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.476 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.477 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.477 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.477 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.477 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.477 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.477 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.477 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.478 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.478 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.478 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.478 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.478 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.478 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.478 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.479 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.479 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.479 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.479 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.479 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.479 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.480 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.480 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.480 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.480 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.480 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.481 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.481 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.481 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.481 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.481 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.481 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.481 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.482 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.482 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.482 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.482 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.482 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.482 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.483 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.483 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.483 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.483 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.483 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.483 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.484 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.484 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.484 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.484 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.484 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.484 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.484 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.485 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.485 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.485 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.485 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.485 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.485 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.486 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.486 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.486 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.486 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.486 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.486 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.486 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.487 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.487 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.487 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.487 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.487 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.487 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.488 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.488 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.488 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.488 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.488 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.488 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.488 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.489 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.489 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.489 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.489 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.489 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.489 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.490 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.490 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.490 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.490 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.490 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.490 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.490 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.491 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.491 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.491 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.491 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.491 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.491 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.492 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.492 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.492 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.492 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.492 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.492 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.493 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.493 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.493 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.493 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.493 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.493 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.494 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.494 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.494 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.494 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.494 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.494 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.494 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.495 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.495 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.495 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.495 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.495 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.495 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.495 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.496 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.496 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.496 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.496 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.496 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.496 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.497 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.497 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.497 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.497 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.497 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.498 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.498 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.498 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.498 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.498 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.499 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.499 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.499 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.499 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.499 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.499 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.500 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.500 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.500 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.500 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.500 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.500 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.501 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.501 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.501 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.501 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.501 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.501 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.501 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.502 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.502 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.502 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.502 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.502 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.502 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.502 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.503 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.503 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.503 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.503 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.503 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.503 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.504 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.504 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.504 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.504 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.504 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.504 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.505 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.505 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.505 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.505 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.505 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.505 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.505 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.505 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.506 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.506 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.506 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.506 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.506 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.506 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.507 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.507 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.507 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.507 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.507 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.507 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.507 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.508 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.508 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.508 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.508 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.508 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.508 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.509 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.509 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.509 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.509 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.509 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.509 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.509 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.510 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.510 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.510 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.510 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.510 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.510 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.510 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.511 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.511 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.511 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.511 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.511 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.511 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.511 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.512 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.512 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.512 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.512 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.512 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.512 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.512 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.513 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.513 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.513 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.513 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.513 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.513 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.514 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.514 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.514 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.514 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.514 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.514 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.515 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.515 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.515 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.515 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.515 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.516 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.516 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.516 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.516 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.516 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.516 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.517 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.517 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.517 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.517 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.517 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.518 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.518 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.518 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.518 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.518 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.518 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.518 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.519 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.519 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.519 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.519 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.519 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.519 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.520 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.520 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.520 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.520 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.520 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.520 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.520 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.521 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.521 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.521 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.521 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.521 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.521 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.522 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.522 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.522 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.522 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.522 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.522 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.522 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.523 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.523 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.523 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.523 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.523 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.523 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.523 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.524 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.524 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.524 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.524 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.524 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.524 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.524 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.525 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.525 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.525 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.525 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.525 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.525 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.526 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.526 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.526 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.526 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.526 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.526 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.526 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.527 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.527 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.527 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.527 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.527 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.527 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.527 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.528 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.528 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.528 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.528 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.528 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.528 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.528 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.529 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.529 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.529 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.529 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.529 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.529 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.529 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.530 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.530 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.530 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.530 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.530 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.530 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.530 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.531 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.531 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.531 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.531 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.531 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.531 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.531 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.532 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.532 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.532 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.532 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.532 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.532 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.532 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.533 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.533 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.533 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.533 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.533 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.533 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.533 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.534 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.534 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.534 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.534 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.534 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.534 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.534 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.535 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.535 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.535 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.535 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.535 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.535 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.535 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.536 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.536 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.536 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.536 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.536 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.536 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.537 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.537 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.537 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.537 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.537 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.537 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.537 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.538 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.538 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.538 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.538 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.538 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.538 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.539 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.539 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.539 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.539 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.539 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.539 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.539 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.540 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.540 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.540 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.540 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.540 187122 DEBUG oslo_service.service [None req-6de7d0e8-e9f9-46ff-8715-928bb46ca98a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.542 187122 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.560 187122 INFO nova.virt.node [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Determined node identity 08b6207d-b34e-43d6-b1a7-1741d75aa10b from /var/lib/nova/compute_id
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.560 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.561 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.561 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.562 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.577 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f8defe6f4c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.580 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f8defe6f4c0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.581 187122 INFO nova.virt.libvirt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Connection event '1' reason 'None'
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.586 187122 INFO nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Libvirt host capabilities <capabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]: 
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <host>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <uuid>29821a9d-05ed-4e9e-b48f-8dca86832284</uuid>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <arch>x86_64</arch>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model>EPYC-Rome-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <vendor>AMD</vendor>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <microcode version='16777317'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <signature family='23' model='49' stepping='0'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <maxphysaddr mode='emulate' bits='40'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='x2apic'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='tsc-deadline'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='osxsave'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='hypervisor'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='tsc_adjust'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='spec-ctrl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='stibp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='arch-capabilities'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='cmp_legacy'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='topoext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='virt-ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='lbrv'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='tsc-scale'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='vmcb-clean'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='pause-filter'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='pfthreshold'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='svme-addr-chk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='rdctl-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='skip-l1dfl-vmentry'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='mds-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature name='pschange-mc-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <pages unit='KiB' size='4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <pages unit='KiB' size='2048'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <pages unit='KiB' size='1048576'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <power_management>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <suspend_mem/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <suspend_disk/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <suspend_hybrid/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </power_management>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <iommu support='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <migration_features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <live/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <uri_transports>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <uri_transport>tcp</uri_transport>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <uri_transport>rdma</uri_transport>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </uri_transports>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </migration_features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <topology>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <cells num='1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <cell id='0'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:           <memory unit='KiB'>7864324</memory>
Nov 24 14:24:32 compute-0 nova_compute[187118]:           <pages unit='KiB' size='4'>1966081</pages>
Nov 24 14:24:32 compute-0 nova_compute[187118]:           <pages unit='KiB' size='2048'>0</pages>
Nov 24 14:24:32 compute-0 nova_compute[187118]:           <pages unit='KiB' size='1048576'>0</pages>
Nov 24 14:24:32 compute-0 nova_compute[187118]:           <distances>
Nov 24 14:24:32 compute-0 nova_compute[187118]:             <sibling id='0' value='10'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:           </distances>
Nov 24 14:24:32 compute-0 nova_compute[187118]:           <cpus num='8'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:           </cpus>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         </cell>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </cells>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </topology>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <cache>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </cache>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <secmodel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model>selinux</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <doi>0</doi>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </secmodel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <secmodel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model>dac</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <doi>0</doi>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <baselabel type='kvm'>+107:+107</baselabel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <baselabel type='qemu'>+107:+107</baselabel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </secmodel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </host>
Nov 24 14:24:32 compute-0 nova_compute[187118]: 
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <guest>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <os_type>hvm</os_type>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <arch name='i686'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <wordsize>32</wordsize>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <domain type='qemu'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <domain type='kvm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </arch>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <pae/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <nonpae/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <acpi default='on' toggle='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <apic default='on' toggle='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <cpuselection/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <deviceboot/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <disksnapshot default='on' toggle='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <externalSnapshot/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </guest>
Nov 24 14:24:32 compute-0 nova_compute[187118]: 
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <guest>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <os_type>hvm</os_type>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <arch name='x86_64'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <wordsize>64</wordsize>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <domain type='qemu'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <domain type='kvm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </arch>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <acpi default='on' toggle='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <apic default='on' toggle='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <cpuselection/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <deviceboot/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <disksnapshot default='on' toggle='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <externalSnapshot/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </guest>
Nov 24 14:24:32 compute-0 nova_compute[187118]: 
Nov 24 14:24:32 compute-0 nova_compute[187118]: </capabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]: 
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.591 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.594 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 24 14:24:32 compute-0 nova_compute[187118]: <domainCapabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <domain>kvm</domain>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <arch>i686</arch>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <vcpu max='240'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <iothreads supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <os supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <enum name='firmware'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <loader supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>rom</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pflash</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='readonly'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>yes</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>no</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='secure'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>no</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </loader>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </os>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='host-passthrough' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='hostPassthroughMigratable'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>on</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>off</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='maximum' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='maximumMigratable'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>on</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>off</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='host-model' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <vendor>AMD</vendor>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='x2apic'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='hypervisor'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='stibp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='overflow-recov'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='succor'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='lbrv'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc-scale'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='flushbyasid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='pause-filter'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='pfthreshold'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='disable' name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='custom' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Dhyana-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Genoa'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='auto-ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='auto-ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-128'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-256'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-512'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v6'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v7'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='KnightsMill'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512er'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512pf'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='KnightsMill-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512er'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512pf'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G4-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tbm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G5-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tbm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SierraForest'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cmpccxadd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SierraForest-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cmpccxadd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='athlon'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='athlon-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='core2duo'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='core2duo-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='coreduo'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='coreduo-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='n270'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='n270-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='phenom'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='phenom-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <memoryBacking supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <enum name='sourceType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>file</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>anonymous</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>memfd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </memoryBacking>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <disk supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='diskDevice'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>disk</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>cdrom</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>floppy</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>lun</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='bus'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ide</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>fdc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>scsi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>sata</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-non-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <graphics supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vnc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>egl-headless</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dbus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <video supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='modelType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vga</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>cirrus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>none</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>bochs</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ramfb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </video>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <hostdev supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='mode'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>subsystem</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='startupPolicy'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>default</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>mandatory</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>requisite</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>optional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='subsysType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pci</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>scsi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='capsType'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='pciBackend'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </hostdev>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <rng supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-non-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>random</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>egd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>builtin</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <filesystem supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='driverType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>path</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>handle</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtiofs</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </filesystem>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <tpm supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tpm-tis</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tpm-crb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>emulator</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>external</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendVersion'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>2.0</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </tpm>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <redirdev supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='bus'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </redirdev>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <channel supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pty</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>unix</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </channel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <crypto supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>qemu</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>builtin</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </crypto>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <interface supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>default</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>passt</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <panic supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>isa</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>hyperv</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </panic>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <console supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>null</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pty</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dev</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>file</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pipe</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>stdio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>udp</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tcp</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>unix</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>qemu-vdagent</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dbus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </console>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <gic supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <vmcoreinfo supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <genid supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <backingStoreInput supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <backup supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <async-teardown supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <ps2 supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <sev supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <sgx supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <hyperv supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='features'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>relaxed</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vapic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>spinlocks</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vpindex</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>runtime</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>synic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>stimer</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>reset</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vendor_id</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>frequencies</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>reenlightenment</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tlbflush</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ipi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>avic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>emsr_bitmap</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>xmm_input</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <defaults>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <spinlocks>4095</spinlocks>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <stimer_direct>on</stimer_direct>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </defaults>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </hyperv>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <launchSecurity supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='sectype'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tdx</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </launchSecurity>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </features>
Nov 24 14:24:32 compute-0 nova_compute[187118]: </domainCapabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.597 187122 DEBUG nova.virt.libvirt.volume.mount [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.607 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 24 14:24:32 compute-0 nova_compute[187118]: <domainCapabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <domain>kvm</domain>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <arch>i686</arch>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <vcpu max='4096'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <iothreads supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <os supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <enum name='firmware'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <loader supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>rom</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pflash</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='readonly'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>yes</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>no</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='secure'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>no</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </loader>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </os>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='host-passthrough' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='hostPassthroughMigratable'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>on</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>off</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='maximum' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='maximumMigratable'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>on</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>off</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='host-model' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <vendor>AMD</vendor>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='x2apic'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='hypervisor'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='stibp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='overflow-recov'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='succor'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='lbrv'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc-scale'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='flushbyasid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='pause-filter'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='pfthreshold'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='disable' name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='custom' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Dhyana-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Genoa'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='auto-ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='auto-ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-128'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-256'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-512'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v6'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v7'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='KnightsMill'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512er'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512pf'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='KnightsMill-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512er'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512pf'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G4-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tbm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G5-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tbm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SierraForest'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cmpccxadd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SierraForest-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cmpccxadd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='athlon'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='athlon-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='core2duo'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='core2duo-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='coreduo'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='coreduo-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='n270'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='n270-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='phenom'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='phenom-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <memoryBacking supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <enum name='sourceType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>file</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>anonymous</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>memfd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </memoryBacking>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <disk supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='diskDevice'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>disk</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>cdrom</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>floppy</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>lun</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='bus'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>fdc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>scsi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>sata</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-non-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <graphics supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vnc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>egl-headless</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dbus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <video supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='modelType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vga</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>cirrus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>none</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>bochs</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ramfb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </video>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <hostdev supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='mode'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>subsystem</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='startupPolicy'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>default</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>mandatory</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>requisite</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>optional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='subsysType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pci</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>scsi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='capsType'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='pciBackend'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </hostdev>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <rng supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-non-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>random</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>egd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>builtin</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <filesystem supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='driverType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>path</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>handle</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtiofs</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </filesystem>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <tpm supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tpm-tis</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tpm-crb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>emulator</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>external</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendVersion'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>2.0</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </tpm>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <redirdev supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='bus'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </redirdev>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <channel supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pty</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>unix</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </channel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <crypto supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>qemu</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>builtin</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </crypto>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <interface supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>default</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>passt</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <panic supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>isa</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>hyperv</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </panic>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <console supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>null</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pty</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dev</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>file</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pipe</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>stdio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>udp</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tcp</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>unix</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>qemu-vdagent</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dbus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </console>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <gic supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <vmcoreinfo supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <genid supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <backingStoreInput supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <backup supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <async-teardown supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <ps2 supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <sev supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <sgx supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <hyperv supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='features'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>relaxed</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vapic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>spinlocks</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vpindex</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>runtime</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>synic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>stimer</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>reset</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vendor_id</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>frequencies</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>reenlightenment</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tlbflush</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ipi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>avic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>emsr_bitmap</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>xmm_input</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <defaults>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <spinlocks>4095</spinlocks>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <stimer_direct>on</stimer_direct>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </defaults>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </hyperv>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <launchSecurity supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='sectype'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tdx</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </launchSecurity>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </features>
Nov 24 14:24:32 compute-0 nova_compute[187118]: </domainCapabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.636 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.639 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 24 14:24:32 compute-0 nova_compute[187118]: <domainCapabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <domain>kvm</domain>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <machine>pc-i440fx-rhel7.6.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <arch>x86_64</arch>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <vcpu max='240'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <iothreads supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <os supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <enum name='firmware'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <loader supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>rom</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pflash</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='readonly'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>yes</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>no</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='secure'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>no</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </loader>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </os>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='host-passthrough' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='hostPassthroughMigratable'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>on</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>off</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='maximum' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='maximumMigratable'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>on</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>off</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='host-model' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <vendor>AMD</vendor>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='x2apic'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='hypervisor'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='stibp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='overflow-recov'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='succor'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='lbrv'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc-scale'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='flushbyasid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='pause-filter'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='pfthreshold'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='disable' name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='custom' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Dhyana-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Genoa'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='auto-ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='auto-ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-128'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-256'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-512'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v6'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v7'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='KnightsMill'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512er'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512pf'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='KnightsMill-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512er'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512pf'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G4-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tbm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G5-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tbm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SierraForest'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cmpccxadd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SierraForest-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cmpccxadd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='athlon'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='athlon-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='core2duo'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='core2duo-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='coreduo'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='coreduo-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='n270'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='n270-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='phenom'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='phenom-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <memoryBacking supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <enum name='sourceType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>file</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>anonymous</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>memfd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </memoryBacking>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <disk supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='diskDevice'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>disk</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>cdrom</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>floppy</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>lun</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='bus'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ide</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>fdc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>scsi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>sata</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-non-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <graphics supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vnc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>egl-headless</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dbus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <video supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='modelType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vga</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>cirrus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>none</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>bochs</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ramfb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </video>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <hostdev supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='mode'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>subsystem</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='startupPolicy'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>default</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>mandatory</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>requisite</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>optional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='subsysType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pci</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>scsi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='capsType'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='pciBackend'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </hostdev>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <rng supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-non-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>random</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>egd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>builtin</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <filesystem supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='driverType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>path</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>handle</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtiofs</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </filesystem>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <tpm supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tpm-tis</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tpm-crb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>emulator</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>external</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendVersion'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>2.0</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </tpm>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <redirdev supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='bus'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </redirdev>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <channel supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pty</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>unix</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </channel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <crypto supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>qemu</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>builtin</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </crypto>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <interface supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>default</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>passt</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <panic supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>isa</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>hyperv</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </panic>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <console supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>null</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pty</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dev</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>file</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pipe</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>stdio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>udp</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tcp</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>unix</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>qemu-vdagent</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dbus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </console>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <gic supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <vmcoreinfo supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <genid supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <backingStoreInput supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <backup supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <async-teardown supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <ps2 supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <sev supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <sgx supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <hyperv supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='features'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>relaxed</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vapic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>spinlocks</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vpindex</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>runtime</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>synic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>stimer</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>reset</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vendor_id</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>frequencies</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>reenlightenment</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tlbflush</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ipi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>avic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>emsr_bitmap</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>xmm_input</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <defaults>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <spinlocks>4095</spinlocks>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <stimer_direct>on</stimer_direct>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </defaults>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </hyperv>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <launchSecurity supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='sectype'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tdx</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </launchSecurity>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </features>
Nov 24 14:24:32 compute-0 nova_compute[187118]: </domainCapabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.711 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 24 14:24:32 compute-0 nova_compute[187118]: <domainCapabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <path>/usr/libexec/qemu-kvm</path>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <domain>kvm</domain>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <machine>pc-q35-rhel9.8.0</machine>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <arch>x86_64</arch>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <vcpu max='4096'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <iothreads supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <os supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <enum name='firmware'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>efi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <loader supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>rom</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pflash</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='readonly'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>yes</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>no</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='secure'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>yes</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>no</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </loader>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </os>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='host-passthrough' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='hostPassthroughMigratable'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>on</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>off</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='maximum' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='maximumMigratable'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>on</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>off</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='host-model' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <vendor>AMD</vendor>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <maxphysaddr mode='passthrough' limit='40'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='x2apic'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc-deadline'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='hypervisor'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc_adjust'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='spec-ctrl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='stibp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='cmp_legacy'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='overflow-recov'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='succor'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='amd-ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='virt-ssbd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='lbrv'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='tsc-scale'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='vmcb-clean'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='flushbyasid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='pause-filter'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='pfthreshold'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='svme-addr-chk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <feature policy='disable' name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <mode name='custom' supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Broadwell-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cascadelake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Cooperlake-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Denverton-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Dhyana-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Genoa'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='auto-ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Genoa-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='auto-ibrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Milan-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amd-psfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='no-nested-data-bp'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='null-sel-clr-base'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='stibp-always-on'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-Rome-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='EPYC-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='GraniteRapids-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-128'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-256'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx10-512'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='prefetchiti'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Haswell-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-noTSX'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v6'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Icelake-Server-v7'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='IvyBridge-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='KnightsMill'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512er'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512pf'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='KnightsMill-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4fmaps'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-4vnniw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512er'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512pf'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G4-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tbm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Opteron_G5-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fma4'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tbm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xop'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SapphireRapids-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='amx-tile'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-bf16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-fp16'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512-vpopcntdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bitalg'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vbmi2'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrc'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fzrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='la57'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='taa-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='tsx-ldtrk'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xfd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SierraForest'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cmpccxadd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='SierraForest-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ifma'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-ne-convert'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx-vnni-int8'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='bus-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cmpccxadd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fbsdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='fsrs'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ibrs-all'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mcdt-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pbrsb-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='psdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='sbdr-ssdp-no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='serialize'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vaes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='vpclmulqdq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Client-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='hle'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='rtm'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Skylake-Server-v5'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512bw'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512cd'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512dq'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512f'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='avx512vl'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='invpcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pcid'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='pku'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='mpx'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v2'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v3'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='core-capability'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='split-lock-detect'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='Snowridge-v4'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='cldemote'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='erms'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='gfni'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdir64b'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='movdiri'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='xsaves'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='athlon'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='athlon-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='core2duo'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='core2duo-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='coreduo'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='coreduo-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='n270'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='n270-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='ss'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='phenom'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <blockers model='phenom-v1'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnow'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <feature name='3dnowext'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </blockers>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </mode>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <memoryBacking supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <enum name='sourceType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>file</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>anonymous</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <value>memfd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </memoryBacking>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <disk supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='diskDevice'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>disk</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>cdrom</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>floppy</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>lun</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='bus'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>fdc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>scsi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>sata</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-non-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <graphics supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vnc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>egl-headless</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dbus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <video supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='modelType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vga</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>cirrus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>none</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>bochs</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ramfb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </video>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <hostdev supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='mode'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>subsystem</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='startupPolicy'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>default</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>mandatory</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>requisite</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>optional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='subsysType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pci</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>scsi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='capsType'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='pciBackend'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </hostdev>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <rng supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtio-non-transitional</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>random</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>egd</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>builtin</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <filesystem supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='driverType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>path</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>handle</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>virtiofs</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </filesystem>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <tpm supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tpm-tis</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tpm-crb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>emulator</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>external</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendVersion'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>2.0</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </tpm>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <redirdev supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='bus'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>usb</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </redirdev>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <channel supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pty</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>unix</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </channel>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <crypto supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>qemu</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendModel'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>builtin</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </crypto>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <interface supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='backendType'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>default</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>passt</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <panic supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='model'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>isa</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>hyperv</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </panic>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <console supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='type'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>null</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vc</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pty</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dev</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>file</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>pipe</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>stdio</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>udp</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tcp</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>unix</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>qemu-vdagent</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>dbus</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </console>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   <features>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <gic supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <vmcoreinfo supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <genid supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <backingStoreInput supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <backup supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <async-teardown supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <ps2 supported='yes'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <sev supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <sgx supported='no'/>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <hyperv supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='features'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>relaxed</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vapic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>spinlocks</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vpindex</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>runtime</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>synic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>stimer</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>reset</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>vendor_id</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>frequencies</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>reenlightenment</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tlbflush</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>ipi</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>avic</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>emsr_bitmap</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>xmm_input</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <defaults>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <spinlocks>4095</spinlocks>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <stimer_direct>on</stimer_direct>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <tlbflush_direct>on</tlbflush_direct>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <tlbflush_extended>on</tlbflush_extended>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <vendor_id>Linux KVM Hv</vendor_id>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </defaults>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </hyperv>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     <launchSecurity supported='yes'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       <enum name='sectype'>
Nov 24 14:24:32 compute-0 nova_compute[187118]:         <value>tdx</value>
Nov 24 14:24:32 compute-0 nova_compute[187118]:       </enum>
Nov 24 14:24:32 compute-0 nova_compute[187118]:     </launchSecurity>
Nov 24 14:24:32 compute-0 nova_compute[187118]:   </features>
Nov 24 14:24:32 compute-0 nova_compute[187118]: </domainCapabilities>
Nov 24 14:24:32 compute-0 nova_compute[187118]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.790 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.791 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.791 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.791 187122 INFO nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Secure Boot support detected
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.794 187122 INFO nova.virt.libvirt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.794 187122 INFO nova.virt.libvirt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.807 187122 DEBUG nova.virt.libvirt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.828 187122 INFO nova.virt.node [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Determined node identity 08b6207d-b34e-43d6-b1a7-1741d75aa10b from /var/lib/nova/compute_id
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.845 187122 WARNING nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Compute nodes ['08b6207d-b34e-43d6-b1a7-1741d75aa10b'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.878 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.900 187122 WARNING nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.901 187122 DEBUG oslo_concurrency.lockutils [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.901 187122 DEBUG oslo_concurrency.lockutils [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.901 187122 DEBUG oslo_concurrency.lockutils [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:24:32 compute-0 nova_compute[187118]: 2025-11-24 14:24:32.901 187122 DEBUG nova.compute.resource_tracker [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:24:33 compute-0 nova_compute[187118]: 2025-11-24 14:24:33.038 187122 WARNING nova.virt.libvirt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:24:33 compute-0 nova_compute[187118]: 2025-11-24 14:24:33.038 187122 DEBUG nova.compute.resource_tracker [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6176MB free_disk=73.66699600219727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:24:33 compute-0 nova_compute[187118]: 2025-11-24 14:24:33.039 187122 DEBUG oslo_concurrency.lockutils [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:24:33 compute-0 nova_compute[187118]: 2025-11-24 14:24:33.039 187122 DEBUG oslo_concurrency.lockutils [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:24:33 compute-0 nova_compute[187118]: 2025-11-24 14:24:33.051 187122 WARNING nova.compute.resource_tracker [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] No compute node record for compute-0.ctlplane.example.com:08b6207d-b34e-43d6-b1a7-1741d75aa10b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 08b6207d-b34e-43d6-b1a7-1741d75aa10b could not be found.
Nov 24 14:24:33 compute-0 nova_compute[187118]: 2025-11-24 14:24:33.070 187122 INFO nova.compute.resource_tracker [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 08b6207d-b34e-43d6-b1a7-1741d75aa10b
Nov 24 14:24:33 compute-0 nova_compute[187118]: 2025-11-24 14:24:33.146 187122 DEBUG nova.compute.resource_tracker [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:24:33 compute-0 nova_compute[187118]: 2025-11-24 14:24:33.147 187122 DEBUG nova.compute.resource_tracker [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.094 187122 INFO nova.scheduler.client.report [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [req-459a22b7-a672-46ec-b4b8-ad8415f0c326] Created resource provider record via placement API for resource provider with UUID 08b6207d-b34e-43d6-b1a7-1741d75aa10b and name compute-0.ctlplane.example.com.
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.480 187122 DEBUG nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 24 14:24:34 compute-0 nova_compute[187118]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.480 187122 INFO nova.virt.libvirt.host [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] kernel doesn't support AMD SEV
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.482 187122 DEBUG nova.compute.provider_tree [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Updating inventory in ProviderTree for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.482 187122 DEBUG nova.virt.libvirt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.553 187122 DEBUG nova.scheduler.client.report [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Updated inventory for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.553 187122 DEBUG nova.compute.provider_tree [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Updating resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.554 187122 DEBUG nova.compute.provider_tree [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Updating inventory in ProviderTree for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.680 187122 DEBUG nova.compute.provider_tree [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Updating resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.701 187122 DEBUG nova.compute.resource_tracker [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.701 187122 DEBUG oslo_concurrency.lockutils [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.701 187122 DEBUG nova.service [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.751 187122 DEBUG nova.service [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Nov 24 14:24:34 compute-0 nova_compute[187118]: 2025-11-24 14:24:34.752 187122 DEBUG nova.servicegroup.drivers.db [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Nov 24 14:24:36 compute-0 sshd-session[187434]: Accepted publickey for zuul from 192.168.122.30 port 38826 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:24:36 compute-0 systemd-logind[807]: New session 26 of user zuul.
Nov 24 14:24:36 compute-0 systemd[1]: Started Session 26 of User zuul.
Nov 24 14:24:36 compute-0 sshd-session[187434]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:24:37 compute-0 python3.9[187587]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 24 14:24:38 compute-0 sudo[187741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cplvlwvxxdpwltaetstskuacdalflrat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994278.2508898-36-187116827109687/AnsiballZ_systemd_service.py'
Nov 24 14:24:38 compute-0 sudo[187741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:39 compute-0 python3.9[187743]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:24:39 compute-0 systemd[1]: Reloading.
Nov 24 14:24:39 compute-0 systemd-sysv-generator[187772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:24:39 compute-0 systemd-rc-local-generator[187769]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:24:39 compute-0 sudo[187741]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:40 compute-0 podman[187902]: 2025-11-24 14:24:40.190638691 +0000 UTC m=+0.126930032 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 14:24:40 compute-0 python3.9[187938]: ansible-ansible.builtin.service_facts Invoked
Nov 24 14:24:40 compute-0 network[187971]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 24 14:24:40 compute-0 network[187972]: 'network-scripts' will be removed from distribution in near future.
Nov 24 14:24:40 compute-0 network[187973]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 24 14:24:44 compute-0 sudo[188245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsozguhkeiwmyoolcwgldgkrrptjedqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994283.8218782-55-58873928962379/AnsiballZ_systemd_service.py'
Nov 24 14:24:44 compute-0 sudo[188245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:44 compute-0 python3.9[188247]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:24:44 compute-0 sudo[188245]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:45 compute-0 sudo[188398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egrtjpfzktvveahpgdnxcxiknpxitnsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994284.674888-65-259870738391366/AnsiballZ_file.py'
Nov 24 14:24:45 compute-0 sudo[188398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:45 compute-0 python3.9[188400]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:24:45 compute-0 sudo[188398]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:45 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 14:24:45 compute-0 sudo[188551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pndbvidtnsmjhbjhatgsyeqhffdrnyhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994285.50769-73-263331806162322/AnsiballZ_file.py'
Nov 24 14:24:45 compute-0 sudo[188551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:45 compute-0 python3.9[188553]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:24:45 compute-0 sudo[188551]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:46 compute-0 sudo[188703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tleisxlgqlpyxsochwhgikimrbgqyflo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994286.251693-82-117549685247696/AnsiballZ_command.py'
Nov 24 14:24:46 compute-0 sudo[188703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:46 compute-0 python3.9[188705]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:24:46 compute-0 sudo[188703]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:47 compute-0 python3.9[188857]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 14:24:48 compute-0 sudo[189007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzhdiapbdlannwrpqujgggmcsdtepzbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994288.0004158-100-104408997433845/AnsiballZ_systemd_service.py'
Nov 24 14:24:48 compute-0 sudo[189007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:48 compute-0 python3.9[189009]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:24:48 compute-0 systemd[1]: Reloading.
Nov 24 14:24:48 compute-0 systemd-rc-local-generator[189035]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:24:48 compute-0 systemd-sysv-generator[189038]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:24:48 compute-0 sudo[189007]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:49 compute-0 sudo[189194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsgndrxhrwshrfgxibfazwbeqpdajtgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994289.0933301-108-236374749012206/AnsiballZ_command.py'
Nov 24 14:24:49 compute-0 sudo[189194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:49 compute-0 python3.9[189196]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:24:49 compute-0 sudo[189194]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:50 compute-0 sudo[189347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbqlanlcosgcgzjfrrzmypzurxpolbqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994289.9091709-117-153022342881310/AnsiballZ_file.py'
Nov 24 14:24:50 compute-0 sudo[189347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:50 compute-0 python3.9[189349]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:50 compute-0 sudo[189347]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:51 compute-0 python3.9[189499]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:24:51 compute-0 python3.9[189651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:52 compute-0 python3.9[189772]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994291.4496763-133-147725984895602/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:24:53 compute-0 sudo[189922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzwkizzwwucladpxmlostsoagnqidmdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994292.8217611-148-221973473656740/AnsiballZ_group.py'
Nov 24 14:24:53 compute-0 sudo[189922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:53 compute-0 python3.9[189924]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 24 14:24:53 compute-0 sudo[189922]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:54 compute-0 sudo[190074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgfhntjokfnynymwgusadylqdkefgxwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994293.808683-159-155672838103707/AnsiballZ_getent.py'
Nov 24 14:24:54 compute-0 sudo[190074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:54 compute-0 python3.9[190076]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 24 14:24:54 compute-0 sudo[190074]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:54 compute-0 sudo[190227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vedbbdstxwytvvnsnkvuoaeyqbdvoyux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994294.6248415-167-222248260148911/AnsiballZ_group.py'
Nov 24 14:24:54 compute-0 sudo[190227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:55 compute-0 python3.9[190229]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 24 14:24:55 compute-0 groupadd[190230]: group added to /etc/group: name=ceilometer, GID=42405
Nov 24 14:24:55 compute-0 groupadd[190230]: group added to /etc/gshadow: name=ceilometer
Nov 24 14:24:55 compute-0 groupadd[190230]: new group: name=ceilometer, GID=42405
Nov 24 14:24:55 compute-0 sudo[190227]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:56 compute-0 sudo[190385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfocdshejcbphwvuzogozggywtzfupro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994295.6716492-175-201716124725857/AnsiballZ_user.py'
Nov 24 14:24:56 compute-0 sudo[190385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:24:56 compute-0 python3.9[190387]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 24 14:24:56 compute-0 useradd[190389]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Nov 24 14:24:56 compute-0 useradd[190389]: add 'ceilometer' to group 'libvirt'
Nov 24 14:24:56 compute-0 useradd[190389]: add 'ceilometer' to shadow group 'libvirt'
Nov 24 14:24:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:24:56.649 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:24:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:24:56.650 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:24:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:24:56.650 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:24:56 compute-0 sudo[190385]: pam_unix(sudo:session): session closed for user root
Nov 24 14:24:57 compute-0 python3.9[190545]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:58 compute-0 python3.9[190666]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763994297.3327317-201-128490882359806/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:24:58 compute-0 python3.9[190816]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:24:59 compute-0 python3.9[190937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763994298.4913387-201-178782959546253/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:24:59 compute-0 podman[190938]: 2025-11-24 14:24:59.44980567 +0000 UTC m=+0.058351865 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 14:25:00 compute-0 python3.9[191108]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:00 compute-0 python3.9[191229]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1763994299.5694377-201-255664007973208/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:01 compute-0 python3.9[191379]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:25:02 compute-0 python3.9[191531]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:25:02 compute-0 podman[191657]: 2025-11-24 14:25:02.620574553 +0000 UTC m=+0.076396715 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:25:02 compute-0 python3.9[191699]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:03 compute-0 python3.9[191824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994302.2774124-260-19784685449556/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:03 compute-0 python3.9[191974]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:04 compute-0 python3.9[192050]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:05 compute-0 python3.9[192200]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:05 compute-0 python3.9[192321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994304.529383-260-179010276663862/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=5e3924a60212b2dd44e3cbce4f857c6716643f3c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:06 compute-0 python3.9[192471]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:06 compute-0 python3.9[192592]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994305.8709517-260-146445370410277/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:07 compute-0 python3.9[192742]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:07 compute-0 python3.9[192863]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994307.005974-260-169284931353581/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:08 compute-0 python3.9[193013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:09 compute-0 python3.9[193134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994308.0823312-260-84901483005152/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=3820eb6e48c35431ebf53228213a5d51b7591223 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:09 compute-0 python3.9[193284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:10 compute-0 python3.9[193405]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994309.1980197-260-104341614232223/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:10 compute-0 podman[193454]: 2025-11-24 14:25:10.478431231 +0000 UTC m=+0.086683270 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:25:10 compute-0 python3.9[193581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:11 compute-0 python3.9[193702]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994310.3637366-260-129758520099458/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=33df3bf08923ad9105770f5abb51d4cde791931a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:12 compute-0 python3.9[193852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:12 compute-0 python3.9[193973]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994311.560731-260-84909256790410/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:13 compute-0 python3.9[194123]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:13 compute-0 python3.9[194244]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994312.7802658-260-154233293146191/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=8bed8129af2c9145e8d37569bb493c0de1895d6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:14 compute-0 python3.9[194395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:15 compute-0 python3.9[194516]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994314.073764-260-263143710771476/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:15 compute-0 sshd-session[193431]: Invalid user admin from 185.156.73.233 port 49054
Nov 24 14:25:15 compute-0 sshd-session[193431]: Connection closed by invalid user admin 185.156.73.233 port 49054 [preauth]
Nov 24 14:25:15 compute-0 python3.9[194666]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:16 compute-0 python3.9[194742]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:16 compute-0 python3.9[194892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:17 compute-0 python3.9[194968]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:18 compute-0 python3.9[195118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:18 compute-0 python3.9[195194]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:19 compute-0 sudo[195344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjrfvubkepizpzorechrmduevugdqdok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994318.70372-449-174280115606369/AnsiballZ_file.py'
Nov 24 14:25:19 compute-0 sudo[195344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:19 compute-0 python3.9[195346]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:19 compute-0 sudo[195344]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:19 compute-0 sudo[195496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goiqakhrdurfwsodyuugepeczprgbzhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994319.5228055-457-130772794863684/AnsiballZ_file.py'
Nov 24 14:25:19 compute-0 sudo[195496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:20 compute-0 python3.9[195498]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:20 compute-0 sudo[195496]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:20 compute-0 sudo[195648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkybshbaiakaeerazcdcdeofnikbueuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994320.3117263-465-56962019623062/AnsiballZ_file.py'
Nov 24 14:25:20 compute-0 sudo[195648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:20 compute-0 nova_compute[187118]: 2025-11-24 14:25:20.753 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:20 compute-0 nova_compute[187118]: 2025-11-24 14:25:20.776 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:20 compute-0 python3.9[195650]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:25:20 compute-0 sudo[195648]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:21 compute-0 sudo[195800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klwzbudtwvefslxcfuyjodsmdlfcmtpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994321.0719514-473-137511147174404/AnsiballZ_systemd_service.py'
Nov 24 14:25:21 compute-0 sudo[195800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:21 compute-0 python3.9[195802]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:25:21 compute-0 systemd[1]: Reloading.
Nov 24 14:25:21 compute-0 systemd-rc-local-generator[195832]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:25:21 compute-0 systemd-sysv-generator[195835]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:25:22 compute-0 systemd[1]: Listening on Podman API Socket.
Nov 24 14:25:22 compute-0 sudo[195800]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:22 compute-0 sudo[195991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrkppzugeimdoalxzzabrijdmijbuuep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994322.4970536-482-245325652757537/AnsiballZ_stat.py'
Nov 24 14:25:22 compute-0 sudo[195991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:22 compute-0 python3.9[195993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:22 compute-0 sudo[195991]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:23 compute-0 sudo[196114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlwnhjztfnhcqeiydkmnndrqguvedzpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994322.4970536-482-245325652757537/AnsiballZ_copy.py'
Nov 24 14:25:23 compute-0 sudo[196114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:23 compute-0 python3.9[196116]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994322.4970536-482-245325652757537/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:25:23 compute-0 sudo[196114]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:23 compute-0 sudo[196190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjmtmgoxosgkgxvuualpdnxqzeannrsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994322.4970536-482-245325652757537/AnsiballZ_stat.py'
Nov 24 14:25:23 compute-0 sudo[196190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:24 compute-0 python3.9[196192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:24 compute-0 sudo[196190]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:24 compute-0 sudo[196313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luzefrzjprmhredenabduocpikxbgpvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994322.4970536-482-245325652757537/AnsiballZ_copy.py'
Nov 24 14:25:24 compute-0 sudo[196313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:24 compute-0 python3.9[196315]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994322.4970536-482-245325652757537/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:25:24 compute-0 sudo[196313]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:25 compute-0 sudo[196465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnlwcojancwjoxgbrtngqaqxkgpfmffg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994325.0172586-510-116560067158600/AnsiballZ_container_config_data.py'
Nov 24 14:25:25 compute-0 sudo[196465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:25 compute-0 python3.9[196467]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 24 14:25:25 compute-0 sudo[196465]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:26 compute-0 sudo[196617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adsjgdrbqogusdmqwsegdznzaliqsuyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994326.0398664-519-158404111701630/AnsiballZ_container_config_hash.py'
Nov 24 14:25:26 compute-0 sudo[196617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:26 compute-0 python3.9[196619]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 14:25:26 compute-0 sudo[196617]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:27 compute-0 sudo[196769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucrbaddevwawtuigwtuxszmbrpuqfpeh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763994327.0428526-529-70493947333874/AnsiballZ_edpm_container_manage.py'
Nov 24 14:25:27 compute-0 sudo[196769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:27 compute-0 python3[196771]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 14:25:28 compute-0 podman[196806]: 2025-11-24 14:25:28.06354625 +0000 UTC m=+0.062635934 container create f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 24 14:25:28 compute-0 podman[196806]: 2025-11-24 14:25:28.020009306 +0000 UTC m=+0.019099020 image pull 5b3bac081df6146e06acefa72320d250dc7d5f82abc7fbe0b9e83aec1e1587f5 quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003
Nov 24 14:25:28 compute-0 python3[196771]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003 kolla_start
Nov 24 14:25:28 compute-0 sudo[196769]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:28 compute-0 sudo[196993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jykbzxvaknptxoatjsgthddcmdmutett ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994328.319191-537-83030237647790/AnsiballZ_stat.py'
Nov 24 14:25:28 compute-0 sudo[196993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:28 compute-0 python3.9[196995]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:25:28 compute-0 sudo[196993]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:29 compute-0 sudo[197147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qustkvvxpfpzlbhtgwbdjzotejfhojpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994329.0653718-546-257503991339726/AnsiballZ_file.py'
Nov 24 14:25:29 compute-0 sudo[197147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:29 compute-0 python3.9[197149]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:29 compute-0 sudo[197147]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:30 compute-0 sudo[197315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzqcgljwfwpzvdkhzxosjtsbmfrwvicm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994329.621645-546-133183423239420/AnsiballZ_copy.py'
Nov 24 14:25:30 compute-0 podman[197272]: 2025-11-24 14:25:30.170000113 +0000 UTC m=+0.082657498 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 14:25:30 compute-0 sudo[197315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:30 compute-0 python3.9[197319]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763994329.621645-546-133183423239420/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:30 compute-0 sudo[197315]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:30 compute-0 sudo[197393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzppdopblpzdbkufelkeokpzxqqjerpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994329.621645-546-133183423239420/AnsiballZ_systemd.py'
Nov 24 14:25:30 compute-0 sudo[197393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:31 compute-0 python3.9[197395]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:25:31 compute-0 systemd[1]: Reloading.
Nov 24 14:25:31 compute-0 systemd-rc-local-generator[197423]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:25:31 compute-0 systemd-sysv-generator[197427]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:25:31 compute-0 sudo[197393]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.799 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.800 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.801 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.801 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.811 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.811 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.812 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.812 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.812 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.812 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.813 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.813 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.813 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.841 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.841 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.842 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:25:31 compute-0 nova_compute[187118]: 2025-11-24 14:25:31.842 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:25:31 compute-0 sudo[197504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fazakftkxibeandnzuhbakwwedrxlroa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994329.621645-546-133183423239420/AnsiballZ_systemd.py'
Nov 24 14:25:31 compute-0 sudo[197504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.019 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.021 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6182MB free_disk=73.66693115234375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.021 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.021 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.081 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.081 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.115 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.131 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.133 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:25:32 compute-0 nova_compute[187118]: 2025-11-24 14:25:32.134 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:25:32 compute-0 python3.9[197506]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:25:32 compute-0 systemd[1]: Reloading.
Nov 24 14:25:32 compute-0 systemd-rc-local-generator[197536]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:25:32 compute-0 systemd-sysv-generator[197540]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:25:32 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 24 14:25:32 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:25:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141c1eb47c0b0aeab8d1665ff10fce7eb463f1578d6eff5ddd2a9929943d5aa0/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141c1eb47c0b0aeab8d1665ff10fce7eb463f1578d6eff5ddd2a9929943d5aa0/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141c1eb47c0b0aeab8d1665ff10fce7eb463f1578d6eff5ddd2a9929943d5aa0/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141c1eb47c0b0aeab8d1665ff10fce7eb463f1578d6eff5ddd2a9929943d5aa0/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:32 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d.
Nov 24 14:25:32 compute-0 podman[197546]: 2025-11-24 14:25:32.776264225 +0000 UTC m=+0.141205728 container init f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: + sudo -E kolla_set_configs
Nov 24 14:25:32 compute-0 podman[197560]: 2025-11-24 14:25:32.796142315 +0000 UTC m=+0.089637061 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: sudo: unable to send audit message: Operation not permitted
Nov 24 14:25:32 compute-0 sudo[197588]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 14:25:32 compute-0 sudo[197588]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 14:25:32 compute-0 sudo[197588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 24 14:25:32 compute-0 podman[197546]: 2025-11-24 14:25:32.803702044 +0000 UTC m=+0.168643537 container start f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 24 14:25:32 compute-0 podman[197546]: ceilometer_agent_compute
Nov 24 14:25:32 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 24 14:25:32 compute-0 sudo[197504]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Validating config file
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Copying service configuration files
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: INFO:__main__:Writing out command to execute
Nov 24 14:25:32 compute-0 podman[197589]: 2025-11-24 14:25:32.870451981 +0000 UTC m=+0.056451433 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:25:32 compute-0 sudo[197588]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: ++ cat /run_command
Nov 24 14:25:32 compute-0 systemd[1]: f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d-53d73b6705c9dfd7.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 14:25:32 compute-0 systemd[1]: f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d-53d73b6705c9dfd7.service: Failed with result 'exit-code'.
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: + ARGS=
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: + sudo kolla_copy_cacerts
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: sudo: unable to send audit message: Operation not permitted
Nov 24 14:25:32 compute-0 sudo[197617]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 14:25:32 compute-0 sudo[197617]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 14:25:32 compute-0 sudo[197617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 24 14:25:32 compute-0 sudo[197617]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: + [[ ! -n '' ]]
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: + . kolla_extend_start
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: + umask 0022
Nov 24 14:25:32 compute-0 ceilometer_agent_compute[197563]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 24 14:25:33 compute-0 sudo[197764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfikmcjqorjsglopxtgeksshvpuyeqds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994333.0220952-570-208346371683484/AnsiballZ_systemd.py'
Nov 24 14:25:33 compute-0 sudo[197764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:33 compute-0 python3.9[197766]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:25:33 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 24 14:25:33 compute-0 systemd[1]: libpod-f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d.scope: Deactivated successfully.
Nov 24 14:25:33 compute-0 podman[197770]: 2025-11-24 14:25:33.700323883 +0000 UTC m=+0.053399559 container died f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:25:33 compute-0 systemd[1]: f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d-53d73b6705c9dfd7.timer: Deactivated successfully.
Nov 24 14:25:33 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d.
Nov 24 14:25:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d-userdata-shm.mount: Deactivated successfully.
Nov 24 14:25:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-141c1eb47c0b0aeab8d1665ff10fce7eb463f1578d6eff5ddd2a9929943d5aa0-merged.mount: Deactivated successfully.
Nov 24 14:25:33 compute-0 podman[197770]: 2025-11-24 14:25:33.742621903 +0000 UTC m=+0.095697589 container cleanup f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 14:25:33 compute-0 podman[197770]: ceilometer_agent_compute
Nov 24 14:25:33 compute-0 podman[197795]: ceilometer_agent_compute
Nov 24 14:25:33 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 24 14:25:33 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 24 14:25:33 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Nov 24 14:25:33 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:25:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141c1eb47c0b0aeab8d1665ff10fce7eb463f1578d6eff5ddd2a9929943d5aa0/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141c1eb47c0b0aeab8d1665ff10fce7eb463f1578d6eff5ddd2a9929943d5aa0/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141c1eb47c0b0aeab8d1665ff10fce7eb463f1578d6eff5ddd2a9929943d5aa0/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141c1eb47c0b0aeab8d1665ff10fce7eb463f1578d6eff5ddd2a9929943d5aa0/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:33 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d.
Nov 24 14:25:33 compute-0 podman[197808]: 2025-11-24 14:25:33.960488161 +0000 UTC m=+0.119181468 container init f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 14:25:33 compute-0 ceilometer_agent_compute[197823]: + sudo -E kolla_set_configs
Nov 24 14:25:33 compute-0 ceilometer_agent_compute[197823]: sudo: unable to send audit message: Operation not permitted
Nov 24 14:25:33 compute-0 podman[197808]: 2025-11-24 14:25:33.994132802 +0000 UTC m=+0.152826139 container start f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 24 14:25:33 compute-0 sudo[197829]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Nov 24 14:25:33 compute-0 sudo[197829]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 14:25:33 compute-0 sudo[197829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 24 14:25:33 compute-0 podman[197808]: ceilometer_agent_compute
Nov 24 14:25:34 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Nov 24 14:25:34 compute-0 sudo[197764]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Validating config file
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Copying service configuration files
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: INFO:__main__:Writing out command to execute
Nov 24 14:25:34 compute-0 sudo[197829]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: ++ cat /run_command
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: + ARGS=
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: + sudo kolla_copy_cacerts
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: sudo: unable to send audit message: Operation not permitted
Nov 24 14:25:34 compute-0 sudo[197848]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Nov 24 14:25:34 compute-0 sudo[197848]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Nov 24 14:25:34 compute-0 sudo[197848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Nov 24 14:25:34 compute-0 sudo[197848]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: + [[ ! -n '' ]]
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: + . kolla_extend_start
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: + umask 0022
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 24 14:25:34 compute-0 podman[197830]: 2025-11-24 14:25:34.092660828 +0000 UTC m=+0.085990760 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 24 14:25:34 compute-0 systemd[1]: f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d-6b54f4227ad39b9.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 14:25:34 compute-0 systemd[1]: f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d-6b54f4227ad39b9.service: Failed with result 'exit-code'.
Nov 24 14:25:34 compute-0 sudo[198002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpqdrzputwcphpuaftoqwruhlsxtqstw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994334.2603717-578-67834038036770/AnsiballZ_stat.py'
Nov 24 14:25:34 compute-0 sudo[198002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:34 compute-0 python3.9[198004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:34 compute-0 sudo[198002]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.893 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.893 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.893 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.893 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.894 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.895 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.896 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.897 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.898 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.899 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.903 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.903 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.903 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.903 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.903 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.903 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.903 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.903 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.904 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.904 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.904 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.904 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.904 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.904 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.904 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.904 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.906 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.906 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.906 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.909 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.909 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.927 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.929 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 24 14:25:34 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:34.930 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.015 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.094 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.094 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.095 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.095 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.095 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.095 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.095 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.095 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.095 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.095 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.096 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.096 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.096 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.096 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.096 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.096 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.096 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.097 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.097 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.097 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.097 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.097 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.097 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.097 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.097 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.097 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.098 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.099 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.099 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.099 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.099 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.099 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.099 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.099 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.099 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.099 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.100 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.100 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.100 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.100 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.100 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.100 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.100 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.100 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.100 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.101 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.101 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.101 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.101 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.101 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.101 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.101 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.101 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.101 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.102 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.102 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.102 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.102 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.102 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.102 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.102 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.102 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.103 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.104 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.105 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.106 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.106 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.106 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.106 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.106 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.106 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.106 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.107 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.107 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.107 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.107 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.107 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.107 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.107 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.107 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.107 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.108 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.108 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.108 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.108 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.108 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.108 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.108 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.108 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.108 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.109 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.110 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.111 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.111 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.111 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.111 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.111 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.111 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.111 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.111 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.111 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.112 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.112 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.112 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.112 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.112 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.112 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.113 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.114 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.115 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.116 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.117 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.118 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.118 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.118 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.120 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.125 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:25:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:25:35 compute-0 sudo[198131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laccfixmmdkwitqgsznkltkkifcvmszp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994334.2603717-578-67834038036770/AnsiballZ_copy.py'
Nov 24 14:25:35 compute-0 sudo[198131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:35 compute-0 python3.9[198133]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994334.2603717-578-67834038036770/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:25:35 compute-0 sudo[198131]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:35 compute-0 sudo[198283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iayfolbxeobswcdelwihgvjyhmpgrlby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994335.6907709-595-186066844143010/AnsiballZ_container_config_data.py'
Nov 24 14:25:35 compute-0 sudo[198283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:36 compute-0 python3.9[198285]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 24 14:25:36 compute-0 sudo[198283]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:36 compute-0 sudo[198435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vklaceqtcksjoyvzopdrndinvccohrtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994336.4711645-604-96561054408767/AnsiballZ_container_config_hash.py'
Nov 24 14:25:36 compute-0 sudo[198435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:36 compute-0 python3.9[198437]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 14:25:36 compute-0 sudo[198435]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:37 compute-0 sudo[198587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdxdynopoougszosdzfmyqzgnabvhqjn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763994337.3104134-614-133593278148379/AnsiballZ_edpm_container_manage.py'
Nov 24 14:25:37 compute-0 sudo[198587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:37 compute-0 python3[198589]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 14:25:38 compute-0 podman[198626]: 2025-11-24 14:25:38.154482934 +0000 UTC m=+0.042212959 container create eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:25:38 compute-0 podman[198626]: 2025-11-24 14:25:38.131489938 +0000 UTC m=+0.019219913 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Nov 24 14:25:38 compute-0 python3[198589]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 24 14:25:38 compute-0 sudo[198587]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:38 compute-0 sudo[198814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tecyxxqubdixvuglofnsmjskuyqvglre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994338.464186-622-21471267921437/AnsiballZ_stat.py'
Nov 24 14:25:38 compute-0 sudo[198814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:39 compute-0 python3.9[198816]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:25:39 compute-0 sudo[198814]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:39 compute-0 sudo[198968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpppoourmhqbimowhkyooyuzniqxspyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994339.2774556-631-28800928993491/AnsiballZ_file.py'
Nov 24 14:25:39 compute-0 sudo[198968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:39 compute-0 python3.9[198970]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:39 compute-0 sudo[198968]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:40 compute-0 sudo[199119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nffkwcvrmdhduhtwefrjpxvdebecswtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994339.8513703-631-34941331312436/AnsiballZ_copy.py'
Nov 24 14:25:40 compute-0 sudo[199119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:40 compute-0 python3.9[199121]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763994339.8513703-631-34941331312436/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:40 compute-0 sudo[199119]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:40 compute-0 sudo[199206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdleutkdcurqdnqaskwuegqnmkjtdqav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994339.8513703-631-34941331312436/AnsiballZ_systemd.py'
Nov 24 14:25:40 compute-0 sudo[199206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:40 compute-0 podman[199169]: 2025-11-24 14:25:40.932066596 +0000 UTC m=+0.123897534 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 24 14:25:41 compute-0 python3.9[199210]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:25:41 compute-0 systemd[1]: Reloading.
Nov 24 14:25:41 compute-0 systemd-rc-local-generator[199252]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:25:41 compute-0 systemd-sysv-generator[199255]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:25:41 compute-0 sudo[199206]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:41 compute-0 sudo[199333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kimikdlkotbalejqnxfpsxwayfowvfuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994339.8513703-631-34941331312436/AnsiballZ_systemd.py'
Nov 24 14:25:41 compute-0 sudo[199333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:42 compute-0 python3.9[199335]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:25:42 compute-0 systemd[1]: Reloading.
Nov 24 14:25:42 compute-0 systemd-sysv-generator[199362]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:25:42 compute-0 systemd-rc-local-generator[199359]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:25:42 compute-0 systemd[1]: Starting node_exporter container...
Nov 24 14:25:42 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:25:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b13f9b5a673d47d1ddf098f21c941e33ae0485489109a03c39d432d915062942/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b13f9b5a673d47d1ddf098f21c941e33ae0485489109a03c39d432d915062942/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:42 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d.
Nov 24 14:25:42 compute-0 podman[199375]: 2025-11-24 14:25:42.598195326 +0000 UTC m=+0.177468569 container init eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.614Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.614Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.614Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.614Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.614Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=arp
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=bcache
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=bonding
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=cpu
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=edac
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=filefd
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=netclass
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=netdev
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=netstat
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=nfs
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=nvme
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=softnet
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=systemd
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=xfs
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.615Z caller=node_exporter.go:117 level=info collector=zfs
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.616Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 24 14:25:42 compute-0 node_exporter[199390]: ts=2025-11-24T14:25:42.617Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 24 14:25:42 compute-0 podman[199375]: 2025-11-24 14:25:42.628961606 +0000 UTC m=+0.208234859 container start eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:25:42 compute-0 podman[199375]: node_exporter
Nov 24 14:25:42 compute-0 systemd[1]: Started node_exporter container.
Nov 24 14:25:42 compute-0 sudo[199333]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:42 compute-0 podman[199399]: 2025-11-24 14:25:42.724578908 +0000 UTC m=+0.076440978 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:25:43 compute-0 sudo[199573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkwbjwkbqcqolysqqbbnrdwvvhqoioqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994342.8838103-655-262761223216356/AnsiballZ_systemd.py'
Nov 24 14:25:43 compute-0 sudo[199573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:43 compute-0 python3.9[199575]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:25:43 compute-0 systemd[1]: Stopping node_exporter container...
Nov 24 14:25:43 compute-0 systemd[1]: libpod-eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d.scope: Deactivated successfully.
Nov 24 14:25:43 compute-0 podman[199579]: 2025-11-24 14:25:43.679634342 +0000 UTC m=+0.070108591 container died eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:25:43 compute-0 systemd[1]: eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d-7374ae10f674f882.timer: Deactivated successfully.
Nov 24 14:25:43 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d.
Nov 24 14:25:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d-userdata-shm.mount: Deactivated successfully.
Nov 24 14:25:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-b13f9b5a673d47d1ddf098f21c941e33ae0485489109a03c39d432d915062942-merged.mount: Deactivated successfully.
Nov 24 14:25:43 compute-0 podman[199579]: 2025-11-24 14:25:43.745313057 +0000 UTC m=+0.135787296 container cleanup eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:25:43 compute-0 podman[199579]: node_exporter
Nov 24 14:25:43 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 24 14:25:43 compute-0 podman[199607]: node_exporter
Nov 24 14:25:43 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 24 14:25:43 compute-0 systemd[1]: Stopped node_exporter container.
Nov 24 14:25:43 compute-0 systemd[1]: Starting node_exporter container...
Nov 24 14:25:43 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:25:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b13f9b5a673d47d1ddf098f21c941e33ae0485489109a03c39d432d915062942/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b13f9b5a673d47d1ddf098f21c941e33ae0485489109a03c39d432d915062942/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:43 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d.
Nov 24 14:25:43 compute-0 podman[199620]: 2025-11-24 14:25:43.978673196 +0000 UTC m=+0.122720590 container init eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.989Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.989Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.989Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.989Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=arp
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=bcache
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=bonding
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=cpu
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=edac
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=filefd
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=netclass
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=netdev
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=netstat
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=nfs
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=nvme
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=softnet
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=systemd
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=xfs
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.990Z caller=node_exporter.go:117 level=info collector=zfs
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.991Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 24 14:25:43 compute-0 node_exporter[199635]: ts=2025-11-24T14:25:43.992Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 24 14:25:44 compute-0 podman[199620]: 2025-11-24 14:25:44.003427028 +0000 UTC m=+0.147474392 container start eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:25:44 compute-0 podman[199620]: node_exporter
Nov 24 14:25:44 compute-0 systemd[1]: Started node_exporter container.
Nov 24 14:25:44 compute-0 sudo[199573]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:44 compute-0 podman[199644]: 2025-11-24 14:25:44.065887173 +0000 UTC m=+0.053099885 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:25:44 compute-0 sudo[199820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyjfjhipqbvdaobkptdsgyssurflaeaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994344.2022781-663-177067062297010/AnsiballZ_stat.py'
Nov 24 14:25:44 compute-0 sudo[199820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:44 compute-0 python3.9[199822]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:44 compute-0 sudo[199820]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:45 compute-0 sudo[199943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmhzsmufnihlzsspzvzvcdjcsswzlavc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994344.2022781-663-177067062297010/AnsiballZ_copy.py'
Nov 24 14:25:45 compute-0 sudo[199943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:45 compute-0 python3.9[199945]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994344.2022781-663-177067062297010/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:25:45 compute-0 sudo[199943]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:46 compute-0 sudo[200095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqyuqivaphutoymuxmfokytedptjxbin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994345.7636344-680-52838037548292/AnsiballZ_container_config_data.py'
Nov 24 14:25:46 compute-0 sudo[200095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:46 compute-0 python3.9[200097]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 24 14:25:46 compute-0 sudo[200095]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:46 compute-0 sudo[200247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crliymernvitacnibvpsjwdjrrsnzfqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994346.565689-689-94768963946816/AnsiballZ_container_config_hash.py'
Nov 24 14:25:46 compute-0 sudo[200247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:47 compute-0 python3.9[200249]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 14:25:47 compute-0 sudo[200247]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:47 compute-0 sudo[200399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiyhqauvmjpqwxofhnwmhmhbfhvyeotq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763994347.404755-699-148605997651964/AnsiballZ_edpm_container_manage.py'
Nov 24 14:25:47 compute-0 sudo[200399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:47 compute-0 python3[200401]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 14:25:49 compute-0 podman[200415]: 2025-11-24 14:25:49.505403001 +0000 UTC m=+1.435403325 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 24 14:25:49 compute-0 podman[200511]: 2025-11-24 14:25:49.729569335 +0000 UTC m=+0.100106239 container create 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Nov 24 14:25:49 compute-0 podman[200511]: 2025-11-24 14:25:49.659957519 +0000 UTC m=+0.030494473 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Nov 24 14:25:49 compute-0 python3[200401]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 24 14:25:49 compute-0 sudo[200399]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:50 compute-0 sudo[200698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwdusejgqzqdvwhzukulldhqfcxffltr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994350.1794384-707-278032534225384/AnsiballZ_stat.py'
Nov 24 14:25:50 compute-0 sudo[200698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:50 compute-0 python3.9[200700]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:25:50 compute-0 sudo[200698]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:51 compute-0 sudo[200852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nklpgbtdkykprdnaasknbzrubzvmaclu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994350.9982119-716-87381496771691/AnsiballZ_file.py'
Nov 24 14:25:51 compute-0 sudo[200852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:51 compute-0 python3.9[200854]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:51 compute-0 sudo[200852]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:52 compute-0 sudo[201003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auavufziycoaxygihtgutryslfzoztnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994351.5633492-716-210557752408309/AnsiballZ_copy.py'
Nov 24 14:25:52 compute-0 sudo[201003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:52 compute-0 python3.9[201005]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763994351.5633492-716-210557752408309/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:25:52 compute-0 sudo[201003]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:52 compute-0 sudo[201079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snrxpxaufzcmutcwocamxrqsabouaowv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994351.5633492-716-210557752408309/AnsiballZ_systemd.py'
Nov 24 14:25:52 compute-0 sudo[201079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:52 compute-0 python3.9[201081]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:25:52 compute-0 systemd[1]: Reloading.
Nov 24 14:25:53 compute-0 systemd-rc-local-generator[201108]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:25:53 compute-0 systemd-sysv-generator[201111]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:25:53 compute-0 sudo[201079]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:53 compute-0 sudo[201190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otbcrblavzjhfmaoakhiaftlwtilmsxw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994351.5633492-716-210557752408309/AnsiballZ_systemd.py'
Nov 24 14:25:53 compute-0 sudo[201190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:53 compute-0 python3.9[201192]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:25:53 compute-0 systemd[1]: Reloading.
Nov 24 14:25:54 compute-0 systemd-rc-local-generator[201221]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:25:54 compute-0 systemd-sysv-generator[201225]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:25:54 compute-0 systemd[1]: Starting podman_exporter container...
Nov 24 14:25:54 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:25:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd1912e316c1ccc872a62da22f0e363f05ef9aa62e14a7564c5b34e1609c6db1/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd1912e316c1ccc872a62da22f0e363f05ef9aa62e14a7564c5b34e1609c6db1/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:54 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440.
Nov 24 14:25:54 compute-0 podman[201232]: 2025-11-24 14:25:54.384417109 +0000 UTC m=+0.155397794 container init 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 14:25:54 compute-0 podman_exporter[201248]: ts=2025-11-24T14:25:54.399Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 24 14:25:54 compute-0 podman_exporter[201248]: ts=2025-11-24T14:25:54.399Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 24 14:25:54 compute-0 podman_exporter[201248]: ts=2025-11-24T14:25:54.399Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 24 14:25:54 compute-0 podman_exporter[201248]: ts=2025-11-24T14:25:54.399Z caller=handler.go:105 level=info collector=container
Nov 24 14:25:54 compute-0 podman[201232]: 2025-11-24 14:25:54.411266128 +0000 UTC m=+0.182246803 container start 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 14:25:54 compute-0 systemd[1]: Starting Podman API Service...
Nov 24 14:25:54 compute-0 systemd[1]: Started Podman API Service.
Nov 24 14:25:54 compute-0 podman[201259]: time="2025-11-24T14:25:54Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 24 14:25:54 compute-0 podman[201259]: time="2025-11-24T14:25:54Z" level=info msg="Setting parallel job count to 25"
Nov 24 14:25:54 compute-0 podman[201259]: time="2025-11-24T14:25:54Z" level=info msg="Using sqlite as database backend"
Nov 24 14:25:54 compute-0 podman[201232]: podman_exporter
Nov 24 14:25:54 compute-0 systemd[1]: Started podman_exporter container.
Nov 24 14:25:54 compute-0 podman[201259]: time="2025-11-24T14:25:54Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 24 14:25:54 compute-0 podman[201259]: time="2025-11-24T14:25:54Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 24 14:25:54 compute-0 podman[201259]: time="2025-11-24T14:25:54Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 24 14:25:54 compute-0 podman[201259]: @ - - [24/Nov/2025:14:25:54 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 24 14:25:54 compute-0 podman[201259]: time="2025-11-24T14:25:54Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 14:25:54 compute-0 sudo[201190]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:54 compute-0 podman[201257]: 2025-11-24 14:25:54.506566811 +0000 UTC m=+0.087706181 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:25:54 compute-0 podman[201259]: @ - - [24/Nov/2025:14:25:54 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 20486 "" "Go-http-client/1.1"
Nov 24 14:25:54 compute-0 podman_exporter[201248]: ts=2025-11-24T14:25:54.509Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 24 14:25:54 compute-0 podman_exporter[201248]: ts=2025-11-24T14:25:54.510Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 24 14:25:54 compute-0 systemd[1]: 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440-53d13bdaa4cd8275.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 14:25:54 compute-0 podman_exporter[201248]: ts=2025-11-24T14:25:54.510Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 24 14:25:54 compute-0 systemd[1]: 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440-53d13bdaa4cd8275.service: Failed with result 'exit-code'.
Nov 24 14:25:54 compute-0 sudo[201441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbxjrxbbhkduongjszabdglkefvlvxje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994354.6892836-740-228071144210255/AnsiballZ_systemd.py'
Nov 24 14:25:54 compute-0 sudo[201441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:55 compute-0 python3.9[201443]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:25:55 compute-0 systemd[1]: Stopping podman_exporter container...
Nov 24 14:25:55 compute-0 podman[201259]: @ - - [24/Nov/2025:14:25:54 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1641 "" "Go-http-client/1.1"
Nov 24 14:25:55 compute-0 systemd[1]: libpod-14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440.scope: Deactivated successfully.
Nov 24 14:25:55 compute-0 podman[201447]: 2025-11-24 14:25:55.359957345 +0000 UTC m=+0.082668322 container died 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 14:25:55 compute-0 systemd[1]: 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440-53d13bdaa4cd8275.timer: Deactivated successfully.
Nov 24 14:25:55 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440.
Nov 24 14:25:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440-userdata-shm.mount: Deactivated successfully.
Nov 24 14:25:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-fd1912e316c1ccc872a62da22f0e363f05ef9aa62e14a7564c5b34e1609c6db1-merged.mount: Deactivated successfully.
Nov 24 14:25:55 compute-0 podman[201447]: 2025-11-24 14:25:55.597861422 +0000 UTC m=+0.320572399 container cleanup 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:25:55 compute-0 podman[201447]: podman_exporter
Nov 24 14:25:55 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 24 14:25:55 compute-0 podman[201473]: podman_exporter
Nov 24 14:25:55 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 24 14:25:55 compute-0 systemd[1]: Stopped podman_exporter container.
Nov 24 14:25:55 compute-0 systemd[1]: Starting podman_exporter container...
Nov 24 14:25:55 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:25:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd1912e316c1ccc872a62da22f0e363f05ef9aa62e14a7564c5b34e1609c6db1/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd1912e316c1ccc872a62da22f0e363f05ef9aa62e14a7564c5b34e1609c6db1/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 14:25:55 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440.
Nov 24 14:25:55 compute-0 podman[201486]: 2025-11-24 14:25:55.787103468 +0000 UTC m=+0.099433718 container init 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:25:55 compute-0 podman_exporter[201501]: ts=2025-11-24T14:25:55.802Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 24 14:25:55 compute-0 podman_exporter[201501]: ts=2025-11-24T14:25:55.802Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 24 14:25:55 compute-0 podman_exporter[201501]: ts=2025-11-24T14:25:55.802Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 24 14:25:55 compute-0 podman_exporter[201501]: ts=2025-11-24T14:25:55.802Z caller=handler.go:105 level=info collector=container
Nov 24 14:25:55 compute-0 podman[201259]: @ - - [24/Nov/2025:14:25:55 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 24 14:25:55 compute-0 podman[201259]: time="2025-11-24T14:25:55Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 24 14:25:55 compute-0 podman[201486]: 2025-11-24 14:25:55.809831024 +0000 UTC m=+0.122161214 container start 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:25:55 compute-0 podman[201486]: podman_exporter
Nov 24 14:25:55 compute-0 systemd[1]: Started podman_exporter container.
Nov 24 14:25:55 compute-0 podman[201259]: @ - - [24/Nov/2025:14:25:55 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 20488 "" "Go-http-client/1.1"
Nov 24 14:25:55 compute-0 podman_exporter[201501]: ts=2025-11-24T14:25:55.821Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 24 14:25:55 compute-0 podman_exporter[201501]: ts=2025-11-24T14:25:55.821Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 24 14:25:55 compute-0 podman_exporter[201501]: ts=2025-11-24T14:25:55.823Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 24 14:25:55 compute-0 sudo[201441]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:55 compute-0 podman[201510]: 2025-11-24 14:25:55.88343309 +0000 UTC m=+0.057452346 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 14:25:56 compute-0 sudo[201684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buvbkzpdwimbxsxcgoqbtdofxynkdhur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994356.0882008-748-261115098858874/AnsiballZ_stat.py'
Nov 24 14:25:56 compute-0 sudo[201684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:25:56.650 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:25:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:25:56.650 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:25:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:25:56.651 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:25:56 compute-0 python3.9[201686]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:25:56 compute-0 sudo[201684]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:56 compute-0 auditd[703]: Audit daemon rotating log files
Nov 24 14:25:57 compute-0 sudo[201807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzmejvsrucxahsrtxryutzpssnkslyqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994356.0882008-748-261115098858874/AnsiballZ_copy.py'
Nov 24 14:25:57 compute-0 sudo[201807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:57 compute-0 python3.9[201809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1763994356.0882008-748-261115098858874/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 24 14:25:57 compute-0 sudo[201807]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:58 compute-0 sudo[201959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcodsjhldehcjuyruayvlbfvhtcrgvih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994357.7122245-765-120749911777331/AnsiballZ_container_config_data.py'
Nov 24 14:25:58 compute-0 sudo[201959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:58 compute-0 python3.9[201961]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 24 14:25:58 compute-0 sudo[201959]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:58 compute-0 sudo[202111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmqjylkqsnligpqkkrwspagvqsxljwdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994358.5309043-774-232150640993237/AnsiballZ_container_config_hash.py'
Nov 24 14:25:58 compute-0 sudo[202111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:59 compute-0 python3.9[202113]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 24 14:25:59 compute-0 sudo[202111]: pam_unix(sudo:session): session closed for user root
Nov 24 14:25:59 compute-0 sudo[202263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnsuqpfagithlsslpunsfdjgjxbazysw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763994359.3062527-784-260666963561770/AnsiballZ_edpm_container_manage.py'
Nov 24 14:25:59 compute-0 sudo[202263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:25:59 compute-0 python3[202265]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 24 14:26:00 compute-0 podman[202290]: 2025-11-24 14:26:00.450173263 +0000 UTC m=+0.063299360 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:26:02 compute-0 podman[202277]: 2025-11-24 14:26:02.423064394 +0000 UTC m=+2.396977850 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 24 14:26:02 compute-0 podman[202393]: 2025-11-24 14:26:02.542246054 +0000 UTC m=+0.040219035 container create 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git)
Nov 24 14:26:02 compute-0 podman[202393]: 2025-11-24 14:26:02.520904048 +0000 UTC m=+0.018877069 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 24 14:26:02 compute-0 python3[202265]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Nov 24 14:26:02 compute-0 sudo[202263]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:03 compute-0 sudo[202592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yceipotnnonkeduirhaspryngivmvcdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994362.8517554-792-83287494584248/AnsiballZ_stat.py'
Nov 24 14:26:03 compute-0 sudo[202592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:03 compute-0 podman[202555]: 2025-11-24 14:26:03.161362532 +0000 UTC m=+0.067593139 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:26:03 compute-0 python3.9[202600]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:26:03 compute-0 sudo[202592]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:03 compute-0 sudo[202755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zywlgwlecfzxiptbkimjwsmajeuvlmyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994363.6079648-801-256213186276181/AnsiballZ_file.py'
Nov 24 14:26:03 compute-0 sudo[202755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:04 compute-0 python3.9[202757]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:04 compute-0 sudo[202755]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:04 compute-0 podman[202831]: 2025-11-24 14:26:04.435918403 +0000 UTC m=+0.040886234 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 14:26:04 compute-0 systemd[1]: f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d-6b54f4227ad39b9.service: Main process exited, code=exited, status=1/FAILURE
Nov 24 14:26:04 compute-0 systemd[1]: f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d-6b54f4227ad39b9.service: Failed with result 'exit-code'.
Nov 24 14:26:04 compute-0 sudo[202922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjwmrifibnejmkxvykpixgnvndhnypee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994364.1757817-801-92794671819302/AnsiballZ_copy.py'
Nov 24 14:26:04 compute-0 sudo[202922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:04 compute-0 python3.9[202924]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1763994364.1757817-801-92794671819302/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:04 compute-0 sudo[202922]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:04 compute-0 sudo[202998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfjbursvqtnlfbonpgdwsitcsmlazjib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994364.1757817-801-92794671819302/AnsiballZ_systemd.py'
Nov 24 14:26:04 compute-0 sudo[202998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:05 compute-0 python3.9[203000]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 24 14:26:05 compute-0 systemd[1]: Reloading.
Nov 24 14:26:05 compute-0 systemd-rc-local-generator[203028]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:26:05 compute-0 systemd-sysv-generator[203031]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:26:05 compute-0 sudo[202998]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:05 compute-0 sudo[203109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjalwcvinaqbtyxszupnyvsolmerakuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994364.1757817-801-92794671819302/AnsiballZ_systemd.py'
Nov 24 14:26:05 compute-0 sudo[203109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:06 compute-0 python3.9[203111]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 24 14:26:06 compute-0 systemd[1]: Reloading.
Nov 24 14:26:06 compute-0 systemd-rc-local-generator[203138]: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 24 14:26:06 compute-0 systemd-sysv-generator[203142]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 24 14:26:06 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 24 14:26:06 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:26:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb2ed6821d64ee149af42185c1c35370880c3bcb95a334d799928794501f5ae0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 24 14:26:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb2ed6821d64ee149af42185c1c35370880c3bcb95a334d799928794501f5ae0/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 14:26:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb2ed6821d64ee149af42185c1c35370880c3bcb95a334d799928794501f5ae0/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 14:26:06 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1.
Nov 24 14:26:06 compute-0 podman[203150]: 2025-11-24 14:26:06.557960881 +0000 UTC m=+0.106234039 container init 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *bridge.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *coverage.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *datapath.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *iface.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *memory.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *ovnnorthd.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *ovn.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *ovsdbserver.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *pmd_perf.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *pmd_rxq.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: INFO    14:26:06 main.go:48: registering *vswitch.Collector
Nov 24 14:26:06 compute-0 openstack_network_exporter[203166]: NOTICE  14:26:06 main.go:76: listening on https://:9105/metrics
Nov 24 14:26:06 compute-0 podman[203150]: 2025-11-24 14:26:06.586724426 +0000 UTC m=+0.134997554 container start 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Nov 24 14:26:06 compute-0 podman[203150]: openstack_network_exporter
Nov 24 14:26:06 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 24 14:26:06 compute-0 sudo[203109]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:06 compute-0 podman[203176]: 2025-11-24 14:26:06.662906424 +0000 UTC m=+0.066986803 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, io.buildah.version=1.33.7, distribution-scope=public, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Nov 24 14:26:07 compute-0 sudo[203348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wikuentvqbtapygrclilzjqujmhtmixh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994366.8164947-825-16977499088727/AnsiballZ_systemd.py'
Nov 24 14:26:07 compute-0 sudo[203348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:07 compute-0 python3.9[203350]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 24 14:26:07 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Nov 24 14:26:07 compute-0 systemd[1]: libpod-6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1.scope: Deactivated successfully.
Nov 24 14:26:07 compute-0 podman[203354]: 2025-11-24 14:26:07.501578696 +0000 UTC m=+0.047992602 container died 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Nov 24 14:26:07 compute-0 systemd[1]: 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1-36d4b24e797af835.timer: Deactivated successfully.
Nov 24 14:26:07 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1.
Nov 24 14:26:07 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1-userdata-shm.mount: Deactivated successfully.
Nov 24 14:26:07 compute-0 systemd[1]: var-lib-containers-storage-overlay-fb2ed6821d64ee149af42185c1c35370880c3bcb95a334d799928794501f5ae0-merged.mount: Deactivated successfully.
Nov 24 14:26:08 compute-0 podman[203354]: 2025-11-24 14:26:08.487674386 +0000 UTC m=+1.034088322 container cleanup 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Nov 24 14:26:08 compute-0 podman[203354]: openstack_network_exporter
Nov 24 14:26:08 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 24 14:26:08 compute-0 podman[203381]: openstack_network_exporter
Nov 24 14:26:08 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 24 14:26:08 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Nov 24 14:26:08 compute-0 systemd[1]: Starting openstack_network_exporter container...
Nov 24 14:26:08 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:26:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb2ed6821d64ee149af42185c1c35370880c3bcb95a334d799928794501f5ae0/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 24 14:26:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb2ed6821d64ee149af42185c1c35370880c3bcb95a334d799928794501f5ae0/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 24 14:26:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb2ed6821d64ee149af42185c1c35370880c3bcb95a334d799928794501f5ae0/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 24 14:26:08 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1.
Nov 24 14:26:08 compute-0 podman[203394]: 2025-11-24 14:26:08.766914228 +0000 UTC m=+0.168559620 container init 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=edpm, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *bridge.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *coverage.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *datapath.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *iface.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *memory.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *ovnnorthd.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *ovn.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *ovsdbserver.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *pmd_perf.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *pmd_rxq.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: INFO    14:26:08 main.go:48: registering *vswitch.Collector
Nov 24 14:26:08 compute-0 openstack_network_exporter[203409]: NOTICE  14:26:08 main.go:76: listening on https://:9105/metrics
Nov 24 14:26:08 compute-0 podman[203394]: 2025-11-24 14:26:08.797158983 +0000 UTC m=+0.198804335 container start 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=)
Nov 24 14:26:08 compute-0 podman[203394]: openstack_network_exporter
Nov 24 14:26:08 compute-0 systemd[1]: Started openstack_network_exporter container.
Nov 24 14:26:08 compute-0 sudo[203348]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:08 compute-0 podman[203419]: 2025-11-24 14:26:08.916436175 +0000 UTC m=+0.109067928 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9)
Nov 24 14:26:09 compute-0 sudo[203589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucakxovzruddwuwggvqlokwfjdgdwgiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994369.0315382-833-219755421701880/AnsiballZ_find.py'
Nov 24 14:26:09 compute-0 sudo[203589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:09 compute-0 python3.9[203591]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 24 14:26:09 compute-0 sudo[203589]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:10 compute-0 sudo[203741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aannigzsjrretrnkububkobdvgfxluxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994369.997952-843-132206630586301/AnsiballZ_podman_container_info.py'
Nov 24 14:26:10 compute-0 sudo[203741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:10 compute-0 python3.9[203743]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 24 14:26:10 compute-0 sudo[203741]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:11 compute-0 sudo[203921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mqdceyinredinyzqdsqjgzfiheouuicx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994371.0164642-851-164734842791345/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:11 compute-0 sudo[203921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:11 compute-0 podman[203861]: 2025-11-24 14:26:11.51472016 +0000 UTC m=+0.109495140 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:26:11 compute-0 python3.9[203926]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:11 compute-0 systemd[1]: Started libpod-conmon-48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff.scope.
Nov 24 14:26:11 compute-0 podman[203932]: 2025-11-24 14:26:11.786068472 +0000 UTC m=+0.070467531 container exec 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller)
Nov 24 14:26:11 compute-0 podman[203932]: 2025-11-24 14:26:11.794895798 +0000 UTC m=+0.079294857 container exec_died 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 24 14:26:11 compute-0 sudo[203921]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:11 compute-0 systemd[1]: libpod-conmon-48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff.scope: Deactivated successfully.
Nov 24 14:26:12 compute-0 sudo[204110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekrmwnczdxidbbksadhlakyurjjzksks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994371.992982-859-81851026168687/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:12 compute-0 sudo[204110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:12 compute-0 python3.9[204112]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:12 compute-0 systemd[1]: Started libpod-conmon-48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff.scope.
Nov 24 14:26:12 compute-0 podman[204113]: 2025-11-24 14:26:12.582905735 +0000 UTC m=+0.071874050 container exec 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:26:12 compute-0 podman[204113]: 2025-11-24 14:26:12.620118754 +0000 UTC m=+0.109087019 container exec_died 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:26:12 compute-0 systemd[1]: libpod-conmon-48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff.scope: Deactivated successfully.
Nov 24 14:26:12 compute-0 sudo[204110]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:13 compute-0 sudo[204293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dahymrejzrhinyfeemveczfiqwweupdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994372.8489034-867-117008733773107/AnsiballZ_file.py'
Nov 24 14:26:13 compute-0 sudo[204293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:13 compute-0 python3.9[204295]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:13 compute-0 sudo[204293]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:13 compute-0 sudo[204445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlzcbcqjuyvugfhlxgkvwtcwmnrasxtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994373.5861108-876-202095787167769/AnsiballZ_podman_container_info.py'
Nov 24 14:26:13 compute-0 sudo[204445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:14 compute-0 python3.9[204447]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 24 14:26:14 compute-0 sudo[204445]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:14 compute-0 podman[204516]: 2025-11-24 14:26:14.488441975 +0000 UTC m=+0.089757349 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:26:14 compute-0 sudo[204634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyglgwigtdjrxirddtefpohqggsijowf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994374.3316226-884-171655358179145/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:14 compute-0 sudo[204634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:14 compute-0 python3.9[204636]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:14 compute-0 systemd[1]: Started libpod-conmon-765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4.scope.
Nov 24 14:26:14 compute-0 podman[204637]: 2025-11-24 14:26:14.944450176 +0000 UTC m=+0.085694366 container exec 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:26:14 compute-0 podman[204637]: 2025-11-24 14:26:14.974312831 +0000 UTC m=+0.115557021 container exec_died 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:26:15 compute-0 systemd[1]: libpod-conmon-765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4.scope: Deactivated successfully.
Nov 24 14:26:15 compute-0 sudo[204634]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:15 compute-0 sudo[204818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcodunsqhqbuokeobuzayqqvezxckutz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994375.1946409-892-70919370507363/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:15 compute-0 sudo[204818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:15 compute-0 python3.9[204820]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:15 compute-0 systemd[1]: Started libpod-conmon-765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4.scope.
Nov 24 14:26:15 compute-0 podman[204821]: 2025-11-24 14:26:15.853272258 +0000 UTC m=+0.085137420 container exec 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:26:15 compute-0 podman[204821]: 2025-11-24 14:26:15.860359206 +0000 UTC m=+0.092224288 container exec_died 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:26:15 compute-0 sudo[204818]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:15 compute-0 systemd[1]: libpod-conmon-765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4.scope: Deactivated successfully.
Nov 24 14:26:16 compute-0 sudo[205000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usxliafzbvttxrzrmpefroachnxfxwoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994376.1104558-900-105277883987262/AnsiballZ_file.py'
Nov 24 14:26:16 compute-0 sudo[205000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:16 compute-0 python3.9[205002]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:16 compute-0 sudo[205000]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:17 compute-0 sudo[205152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmpoifgmaalbgpmtwuoiwhntyirwnveo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994376.8420742-909-110309952534766/AnsiballZ_podman_container_info.py'
Nov 24 14:26:17 compute-0 sudo[205152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:17 compute-0 python3.9[205154]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 24 14:26:17 compute-0 sudo[205152]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:17 compute-0 sudo[205317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgjjwbrlbwgbecldelwluviagqanwzzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994377.671-917-185013785389703/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:17 compute-0 sudo[205317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:18 compute-0 python3.9[205319]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:18 compute-0 systemd[1]: Started libpod-conmon-8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c.scope.
Nov 24 14:26:18 compute-0 podman[205320]: 2025-11-24 14:26:18.310351627 +0000 UTC m=+0.093308177 container exec 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:26:18 compute-0 podman[205320]: 2025-11-24 14:26:18.345114139 +0000 UTC m=+0.128070669 container exec_died 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3)
Nov 24 14:26:18 compute-0 systemd[1]: libpod-conmon-8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c.scope: Deactivated successfully.
Nov 24 14:26:18 compute-0 sudo[205317]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:18 compute-0 sudo[205500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivpowidgshdcvlfdknjhxnrxvptvlzqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994378.5498197-925-147021717975291/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:18 compute-0 sudo[205500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:19 compute-0 python3.9[205502]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:19 compute-0 systemd[1]: Started libpod-conmon-8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c.scope.
Nov 24 14:26:19 compute-0 podman[205503]: 2025-11-24 14:26:19.148632529 +0000 UTC m=+0.091501997 container exec 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3)
Nov 24 14:26:19 compute-0 podman[205503]: 2025-11-24 14:26:19.185766767 +0000 UTC m=+0.128636215 container exec_died 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 24 14:26:19 compute-0 systemd[1]: libpod-conmon-8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c.scope: Deactivated successfully.
Nov 24 14:26:19 compute-0 sudo[205500]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:19 compute-0 sudo[205685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrocvjmqhqcfkiynakakbjayzpfcgpuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994379.3993664-933-185185049451717/AnsiballZ_file.py'
Nov 24 14:26:19 compute-0 sudo[205685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:19 compute-0 python3.9[205687]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:19 compute-0 sudo[205685]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:20 compute-0 sudo[205837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmquvsacgnfwbdphsukpgjymbbwamqcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994380.123009-942-17050241504118/AnsiballZ_podman_container_info.py'
Nov 24 14:26:20 compute-0 sudo[205837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:20 compute-0 python3.9[205839]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 24 14:26:20 compute-0 sudo[205837]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:21 compute-0 sudo[206002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umbbwvzksejqnzzmgqzzjvrmaonpkqqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994380.877193-950-274378689706229/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:21 compute-0 sudo[206002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:21 compute-0 python3.9[206004]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:21 compute-0 systemd[1]: Started libpod-conmon-f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d.scope.
Nov 24 14:26:21 compute-0 podman[206005]: 2025-11-24 14:26:21.528185603 +0000 UTC m=+0.089045019 container exec f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 24 14:26:21 compute-0 podman[206005]: 2025-11-24 14:26:21.537096981 +0000 UTC m=+0.097956387 container exec_died f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 24 14:26:21 compute-0 systemd[1]: libpod-conmon-f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d.scope: Deactivated successfully.
Nov 24 14:26:21 compute-0 sudo[206002]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:22 compute-0 sudo[206186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxlwusvwyytldzjhsaafqqavsmwlxjbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994381.7542243-958-227084803555099/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:22 compute-0 sudo[206186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:22 compute-0 python3.9[206188]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:22 compute-0 systemd[1]: Started libpod-conmon-f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d.scope.
Nov 24 14:26:22 compute-0 podman[206189]: 2025-11-24 14:26:22.370956049 +0000 UTC m=+0.089567693 container exec f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 14:26:22 compute-0 podman[206189]: 2025-11-24 14:26:22.402870821 +0000 UTC m=+0.121482465 container exec_died f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 24 14:26:22 compute-0 systemd[1]: libpod-conmon-f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d.scope: Deactivated successfully.
Nov 24 14:26:22 compute-0 sudo[206186]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:22 compute-0 sudo[206370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyzaeuguzcabujfdnkumodqqhmsxxfin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994382.6084392-966-183616945458543/AnsiballZ_file.py'
Nov 24 14:26:22 compute-0 sudo[206370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:23 compute-0 python3.9[206372]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:23 compute-0 sudo[206370]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:23 compute-0 sudo[206522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hutswkladrjuidtmrouurmwsuxaickhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994383.3600116-975-267888506774788/AnsiballZ_podman_container_info.py'
Nov 24 14:26:23 compute-0 sudo[206522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:23 compute-0 python3.9[206524]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 24 14:26:23 compute-0 sudo[206522]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:24 compute-0 sudo[206687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybeqimogdqnaxemzeegslbvzwtzprgbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994384.1201897-983-273099313659673/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:24 compute-0 sudo[206687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:24 compute-0 python3.9[206689]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:24 compute-0 systemd[1]: Started libpod-conmon-eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d.scope.
Nov 24 14:26:24 compute-0 podman[206690]: 2025-11-24 14:26:24.760126762 +0000 UTC m=+0.094782739 container exec eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:26:24 compute-0 podman[206690]: 2025-11-24 14:26:24.79618621 +0000 UTC m=+0.130842187 container exec_died eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:26:24 compute-0 systemd[1]: libpod-conmon-eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d.scope: Deactivated successfully.
Nov 24 14:26:24 compute-0 sudo[206687]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:25 compute-0 sudo[206871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwmhfjcmngcbabrfcujpgkbdjtgdkrsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994385.033498-991-239662961373075/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:25 compute-0 sudo[206871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:25 compute-0 python3.9[206873]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:25 compute-0 systemd[1]: Started libpod-conmon-eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d.scope.
Nov 24 14:26:25 compute-0 podman[206874]: 2025-11-24 14:26:25.67023029 +0000 UTC m=+0.089423340 container exec eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:26:25 compute-0 podman[206874]: 2025-11-24 14:26:25.701216496 +0000 UTC m=+0.120409526 container exec_died eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:26:25 compute-0 systemd[1]: libpod-conmon-eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d.scope: Deactivated successfully.
Nov 24 14:26:25 compute-0 sudo[206871]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:26 compute-0 sudo[207069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykzmlwjtozerppboeergjnfpgsnejorp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994385.9571488-999-145855273825970/AnsiballZ_file.py'
Nov 24 14:26:26 compute-0 sudo[207069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:26 compute-0 podman[207027]: 2025-11-24 14:26:26.285258673 +0000 UTC m=+0.064060451 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:26:26 compute-0 python3.9[207080]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:26 compute-0 sudo[207069]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:27 compute-0 sudo[207230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbqiyprpdfnusttstliasprnujkbesf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994386.7202156-1008-133123517619084/AnsiballZ_podman_container_info.py'
Nov 24 14:26:27 compute-0 sudo[207230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:27 compute-0 python3.9[207232]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 24 14:26:27 compute-0 sudo[207230]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:27 compute-0 sudo[207395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-prqqzrbzspfmgirwjixbzcrajisaatzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994387.4772131-1016-254800396003212/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:27 compute-0 sudo[207395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:27 compute-0 python3.9[207397]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:28 compute-0 systemd[1]: Started libpod-conmon-14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440.scope.
Nov 24 14:26:28 compute-0 podman[207398]: 2025-11-24 14:26:28.023038427 +0000 UTC m=+0.062619122 container exec 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 14:26:28 compute-0 podman[207417]: 2025-11-24 14:26:28.084837243 +0000 UTC m=+0.051653025 container exec_died 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:26:28 compute-0 podman[207398]: 2025-11-24 14:26:28.090222644 +0000 UTC m=+0.129803339 container exec_died 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:26:28 compute-0 systemd[1]: libpod-conmon-14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440.scope: Deactivated successfully.
Nov 24 14:26:28 compute-0 sudo[207395]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:28 compute-0 sudo[207579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbpxequuptaytohiddhmpqoxjbcfycwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994388.2819304-1024-25076273862505/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:28 compute-0 sudo[207579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:28 compute-0 python3.9[207581]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:28 compute-0 systemd[1]: Started libpod-conmon-14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440.scope.
Nov 24 14:26:28 compute-0 podman[207582]: 2025-11-24 14:26:28.87145705 +0000 UTC m=+0.086302301 container exec 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:26:28 compute-0 podman[207582]: 2025-11-24 14:26:28.900939904 +0000 UTC m=+0.115785155 container exec_died 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 14:26:28 compute-0 systemd[1]: libpod-conmon-14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440.scope: Deactivated successfully.
Nov 24 14:26:28 compute-0 sudo[207579]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:29 compute-0 sudo[207764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcmusdvcwvjlmmmmwtozwbckqtdkqwax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994389.1192443-1032-59558777812436/AnsiballZ_file.py'
Nov 24 14:26:29 compute-0 sudo[207764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:29 compute-0 python3.9[207766]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:29 compute-0 sudo[207764]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:30 compute-0 sudo[207916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hduqqxvbqhfylotljmtndwsuhivwouwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994389.9800346-1041-245220249990568/AnsiballZ_podman_container_info.py'
Nov 24 14:26:30 compute-0 sudo[207916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:30 compute-0 python3.9[207918]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 24 14:26:30 compute-0 sudo[207916]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:31 compute-0 sudo[208095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqomofugakjahaeexbbvrdjnhmcemikv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994390.8726203-1049-52652223222769/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:31 compute-0 sudo[208095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:31 compute-0 podman[208055]: 2025-11-24 14:26:31.238637798 +0000 UTC m=+0.091425704 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 24 14:26:31 compute-0 python3.9[208101]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:31 compute-0 systemd[1]: Started libpod-conmon-6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1.scope.
Nov 24 14:26:31 compute-0 podman[208102]: 2025-11-24 14:26:31.536343987 +0000 UTC m=+0.096045315 container exec 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=)
Nov 24 14:26:31 compute-0 podman[208102]: 2025-11-24 14:26:31.571305163 +0000 UTC m=+0.131006551 container exec_died 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.expose-services=)
Nov 24 14:26:31 compute-0 systemd[1]: libpod-conmon-6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1.scope: Deactivated successfully.
Nov 24 14:26:31 compute-0 sudo[208095]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.124 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.125 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:32 compute-0 sudo[208283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyahslilsnhgcdsfiludzjpkgipsvwav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994391.8110623-1057-269693088671511/AnsiballZ_podman_container_exec.py'
Nov 24 14:26:32 compute-0 sudo[208283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.138 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.138 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.139 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.148 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.148 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.149 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:32 compute-0 python3.9[208285]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 24 14:26:32 compute-0 systemd[1]: Started libpod-conmon-6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1.scope.
Nov 24 14:26:32 compute-0 podman[208286]: 2025-11-24 14:26:32.462497523 +0000 UTC m=+0.080485309 container exec 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 14:26:32 compute-0 podman[208286]: 2025-11-24 14:26:32.492876192 +0000 UTC m=+0.110863958 container exec_died 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Nov 24 14:26:32 compute-0 systemd[1]: libpod-conmon-6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1.scope: Deactivated successfully.
Nov 24 14:26:32 compute-0 sudo[208283]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:32 compute-0 nova_compute[187118]: 2025-11-24 14:26:32.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:32 compute-0 sudo[208468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhlslwhvqxaqeqcbylrarqhjpvzythzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994392.678903-1065-80479044580485/AnsiballZ_file.py'
Nov 24 14:26:32 compute-0 sudo[208468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:33 compute-0 python3.9[208470]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:33 compute-0 sudo[208468]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:33 compute-0 podman[208502]: 2025-11-24 14:26:33.48173186 +0000 UTC m=+0.074044230 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 14:26:33 compute-0 sudo[208639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iagugbdrnoihtgcnrqhinmmuphlqobyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994393.3991044-1074-219160210161561/AnsiballZ_file.py'
Nov 24 14:26:33 compute-0 sudo[208639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.825 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.825 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.826 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.826 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:26:33 compute-0 python3.9[208641]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:33 compute-0 sudo[208639]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.963 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.964 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5929MB free_disk=73.49741744995117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.964 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:26:33 compute-0 nova_compute[187118]: 2025-11-24 14:26:33.964 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:26:34 compute-0 nova_compute[187118]: 2025-11-24 14:26:34.037 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:26:34 compute-0 nova_compute[187118]: 2025-11-24 14:26:34.037 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:26:34 compute-0 nova_compute[187118]: 2025-11-24 14:26:34.070 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:26:34 compute-0 nova_compute[187118]: 2025-11-24 14:26:34.107 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:26:34 compute-0 nova_compute[187118]: 2025-11-24 14:26:34.108 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:26:34 compute-0 nova_compute[187118]: 2025-11-24 14:26:34.109 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:26:34 compute-0 sudo[208791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqnvstjizktmdjnjkszogaybfvitglwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994394.054607-1082-79176747994550/AnsiballZ_stat.py'
Nov 24 14:26:34 compute-0 sudo[208791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:34 compute-0 python3.9[208793]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:26:34 compute-0 sudo[208791]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:34 compute-0 sudo[208927]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hngilcjpimwigcxlzsqanrcozamafgta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994394.054607-1082-79176747994550/AnsiballZ_copy.py'
Nov 24 14:26:34 compute-0 sudo[208927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:34 compute-0 podman[208888]: 2025-11-24 14:26:34.886754976 +0000 UTC m=+0.054955787 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 24 14:26:35 compute-0 python3.9[208936]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1763994394.054607-1082-79176747994550/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:35 compute-0 sudo[208927]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:35 compute-0 sudo[209086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fudtgqqrdndfzkyylercxrusadoafgts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994395.3879066-1098-203141770528176/AnsiballZ_file.py'
Nov 24 14:26:35 compute-0 sudo[209086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:35 compute-0 python3.9[209088]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:35 compute-0 sudo[209086]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:36 compute-0 sudo[209238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mefymewpoqxtbacdpjtzutjxrntweczg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994396.083357-1106-240769849596862/AnsiballZ_stat.py'
Nov 24 14:26:36 compute-0 sudo[209238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:36 compute-0 python3.9[209240]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:26:36 compute-0 sudo[209238]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:36 compute-0 sudo[209316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ieciugrsepejueopibpdnnxkzxuqhenc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994396.083357-1106-240769849596862/AnsiballZ_file.py'
Nov 24 14:26:36 compute-0 sudo[209316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:37 compute-0 python3.9[209318]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:37 compute-0 sudo[209316]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:37 compute-0 sudo[209468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyntitlfzbumwedapksrgvoofkyqbniu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994397.2145567-1118-36677610914696/AnsiballZ_stat.py'
Nov 24 14:26:37 compute-0 sudo[209468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:37 compute-0 python3.9[209470]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:26:37 compute-0 sudo[209468]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:37 compute-0 sudo[209546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnzinhzmifbqhylyjisbuevqdnstgjbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994397.2145567-1118-36677610914696/AnsiballZ_file.py'
Nov 24 14:26:37 compute-0 sudo[209546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:38 compute-0 python3.9[209548]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qywuligt recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:38 compute-0 sudo[209546]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:38 compute-0 sudo[209698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaxywdtavafwbyxvzliietvjwjcqzizu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994398.247989-1130-141125918708628/AnsiballZ_stat.py'
Nov 24 14:26:38 compute-0 sudo[209698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:38 compute-0 python3.9[209700]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:26:38 compute-0 sudo[209698]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:38 compute-0 sudo[209776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgstwbbtufbicdkrprluozvwxunpktwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994398.247989-1130-141125918708628/AnsiballZ_file.py'
Nov 24 14:26:38 compute-0 sudo[209776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:39 compute-0 python3.9[209778]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:39 compute-0 sudo[209776]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:39 compute-0 podman[209845]: 2025-11-24 14:26:39.478082786 +0000 UTC m=+0.084322897 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350)
Nov 24 14:26:39 compute-0 sudo[209949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnllmdqcqslczvyyhpaqyduxlocsenmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994399.3270261-1143-96611653216393/AnsiballZ_command.py'
Nov 24 14:26:39 compute-0 sudo[209949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:39 compute-0 python3.9[209951]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:26:39 compute-0 sudo[209949]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:40 compute-0 sudo[210102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmystznvprfprmbiagddtuytersumoxe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1763994400.0556571-1151-212674768730974/AnsiballZ_edpm_nftables_from_files.py'
Nov 24 14:26:40 compute-0 sudo[210102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:40 compute-0 python3[210104]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 24 14:26:40 compute-0 sudo[210102]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:41 compute-0 sudo[210254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdzbjuupxysabnhxlznugcvenxfofdof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994401.0647461-1159-20851624561643/AnsiballZ_stat.py'
Nov 24 14:26:41 compute-0 sudo[210254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:41 compute-0 python3.9[210256]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:26:41 compute-0 sudo[210254]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:41 compute-0 sudo[210343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzaclvvqnfeehomknzgzwrfwyyoupivh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994401.0647461-1159-20851624561643/AnsiballZ_file.py'
Nov 24 14:26:41 compute-0 sudo[210343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:41 compute-0 podman[210306]: 2025-11-24 14:26:41.979392312 +0000 UTC m=+0.078309419 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 14:26:42 compute-0 python3.9[210352]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:42 compute-0 sudo[210343]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:42 compute-0 sudo[210508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llaygcwuponncpeggkfzbrevsmayfnqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994402.3158116-1171-277549559142066/AnsiballZ_stat.py'
Nov 24 14:26:42 compute-0 sudo[210508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:42 compute-0 python3.9[210510]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:26:42 compute-0 sudo[210508]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:43 compute-0 sudo[210586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uocxiykjrkhlicyexydhglykpptyvvgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994402.3158116-1171-277549559142066/AnsiballZ_file.py'
Nov 24 14:26:43 compute-0 sudo[210586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:43 compute-0 python3.9[210588]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:43 compute-0 sudo[210586]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:43 compute-0 sudo[210738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqmozzyeobzlmnrlrhksjycrbfcfqwwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994403.62214-1183-199468365129253/AnsiballZ_stat.py'
Nov 24 14:26:43 compute-0 sudo[210738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:44 compute-0 python3.9[210740]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:26:44 compute-0 sudo[210738]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:44 compute-0 sudo[210816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctldwehcdnrotyjngdzdjjfuovgomujt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994403.62214-1183-199468365129253/AnsiballZ_file.py'
Nov 24 14:26:44 compute-0 sudo[210816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:44 compute-0 python3.9[210818]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:44 compute-0 sudo[210816]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:45 compute-0 sudo[210992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lglvqaxhlbmnkwejqpdnlbnnjlactaxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994404.8051133-1195-241332892550971/AnsiballZ_stat.py'
Nov 24 14:26:45 compute-0 sudo[210992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:45 compute-0 podman[210942]: 2025-11-24 14:26:45.187845325 +0000 UTC m=+0.075379397 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:26:45 compute-0 python3.9[210994]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:26:45 compute-0 sudo[210992]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:45 compute-0 sudo[211070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlvcfvgufssbguvlebyyqvbscdgpjwca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994404.8051133-1195-241332892550971/AnsiballZ_file.py'
Nov 24 14:26:45 compute-0 sudo[211070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:45 compute-0 python3.9[211072]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:45 compute-0 sudo[211070]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:46 compute-0 sudo[211222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epdbykpydxzvquryxksixxqxfcbnqvlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994406.123844-1207-153630331874023/AnsiballZ_stat.py'
Nov 24 14:26:46 compute-0 sudo[211222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:46 compute-0 python3.9[211224]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 24 14:26:46 compute-0 sudo[211222]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:47 compute-0 sudo[211347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gocsxcypdimdfajwgpnquiglzxhybkcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994406.123844-1207-153630331874023/AnsiballZ_copy.py'
Nov 24 14:26:47 compute-0 sudo[211347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:47 compute-0 python3.9[211349]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1763994406.123844-1207-153630331874023/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:47 compute-0 sudo[211347]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:47 compute-0 sudo[211499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beqgzzjfstwzsqbvnzaykldtxlpvucib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994407.5693967-1222-2512438145271/AnsiballZ_file.py'
Nov 24 14:26:47 compute-0 sudo[211499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:48 compute-0 python3.9[211501]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:48 compute-0 sudo[211499]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:48 compute-0 sudo[211651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxlaseacqonmrybmlfgnqefxsxxduzqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994408.3006766-1230-260203503916020/AnsiballZ_command.py'
Nov 24 14:26:48 compute-0 sudo[211651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:48 compute-0 python3.9[211653]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:26:48 compute-0 sudo[211651]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:49 compute-0 sudo[211806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlunkzazhiurfuzjcnmwmnywkzrjeakz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994409.0510595-1238-34224168893425/AnsiballZ_blockinfile.py'
Nov 24 14:26:49 compute-0 sudo[211806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:49 compute-0 python3.9[211808]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:49 compute-0 sudo[211806]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:50 compute-0 sudo[211958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooglveyutheahesejbameuahklmoeqrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994409.9615319-1247-211511511723212/AnsiballZ_command.py'
Nov 24 14:26:50 compute-0 sudo[211958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:50 compute-0 python3.9[211960]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:26:50 compute-0 sudo[211958]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:51 compute-0 sudo[212111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvjmsyflfndejmwxyjabgjrsgbvyipwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994410.6476486-1255-225260537650749/AnsiballZ_stat.py'
Nov 24 14:26:51 compute-0 sudo[212111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:51 compute-0 python3.9[212113]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 24 14:26:51 compute-0 sudo[212111]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:51 compute-0 sudo[212265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtfupubentwcyahvelnmdsxvgqhuqeqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994411.4990792-1263-266466023808057/AnsiballZ_command.py'
Nov 24 14:26:51 compute-0 sudo[212265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:52 compute-0 python3.9[212267]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 24 14:26:52 compute-0 sudo[212265]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:52 compute-0 sudo[212420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmfwvpagjgbnuxgykcoqlgauajzvrdrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1763994412.364928-1271-244329421269402/AnsiballZ_file.py'
Nov 24 14:26:52 compute-0 sudo[212420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:26:52 compute-0 python3.9[212422]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 24 14:26:52 compute-0 sudo[212420]: pam_unix(sudo:session): session closed for user root
Nov 24 14:26:53 compute-0 sshd-session[187437]: Connection closed by 192.168.122.30 port 38826
Nov 24 14:26:53 compute-0 sshd-session[187434]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:26:53 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Nov 24 14:26:53 compute-0 systemd[1]: session-26.scope: Consumed 1min 42.838s CPU time.
Nov 24 14:26:53 compute-0 systemd-logind[807]: Session 26 logged out. Waiting for processes to exit.
Nov 24 14:26:53 compute-0 systemd-logind[807]: Removed session 26.
Nov 24 14:26:56 compute-0 podman[212447]: 2025-11-24 14:26:56.482225067 +0000 UTC m=+0.076685814 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 14:26:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:26:56.652 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:26:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:26:56.652 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:26:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:26:56.652 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:27:01 compute-0 podman[212471]: 2025-11-24 14:27:01.443739978 +0000 UTC m=+0.053956111 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:27:04 compute-0 podman[212490]: 2025-11-24 14:27:04.486165919 +0000 UTC m=+0.079864701 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Nov 24 14:27:05 compute-0 podman[212511]: 2025-11-24 14:27:05.444519936 +0000 UTC m=+0.054839235 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 14:27:10 compute-0 podman[212531]: 2025-11-24 14:27:10.438591521 +0000 UTC m=+0.051813502 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Nov 24 14:27:12 compute-0 podman[212553]: 2025-11-24 14:27:12.465864476 +0000 UTC m=+0.078149533 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 14:27:15 compute-0 podman[212579]: 2025-11-24 14:27:15.457372542 +0000 UTC m=+0.060153430 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:27:27 compute-0 podman[212605]: 2025-11-24 14:27:27.442935997 +0000 UTC m=+0.049185829 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:27:32 compute-0 nova_compute[187118]: 2025-11-24 14:27:32.109 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:27:32 compute-0 nova_compute[187118]: 2025-11-24 14:27:32.109 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:27:32 compute-0 nova_compute[187118]: 2025-11-24 14:27:32.109 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:27:32 compute-0 nova_compute[187118]: 2025-11-24 14:27:32.128 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:27:32 compute-0 nova_compute[187118]: 2025-11-24 14:27:32.128 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:27:32 compute-0 podman[212629]: 2025-11-24 14:27:32.456755363 +0000 UTC m=+0.065232359 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 24 14:27:33 compute-0 nova_compute[187118]: 2025-11-24 14:27:33.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:27:33 compute-0 nova_compute[187118]: 2025-11-24 14:27:33.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:27:33 compute-0 nova_compute[187118]: 2025-11-24 14:27:33.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:27:33 compute-0 nova_compute[187118]: 2025-11-24 14:27:33.828 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:27:33 compute-0 nova_compute[187118]: 2025-11-24 14:27:33.829 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:27:33 compute-0 nova_compute[187118]: 2025-11-24 14:27:33.829 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:27:33 compute-0 nova_compute[187118]: 2025-11-24 14:27:33.829 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.025 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.026 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6018MB free_disk=73.49697875976562GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.026 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.026 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.096 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.097 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.117 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.129 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.130 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:27:34 compute-0 nova_compute[187118]: 2025-11-24 14:27:34.131 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:27:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:27:35 compute-0 nova_compute[187118]: 2025-11-24 14:27:35.130 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:27:35 compute-0 nova_compute[187118]: 2025-11-24 14:27:35.130 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:27:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:27:35.368 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:27:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:27:35.369 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:27:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:27:35.370 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:27:35 compute-0 podman[212648]: 2025-11-24 14:27:35.453570433 +0000 UTC m=+0.061575620 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 24 14:27:35 compute-0 podman[212668]: 2025-11-24 14:27:35.529981097 +0000 UTC m=+0.048641334 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 14:27:35 compute-0 nova_compute[187118]: 2025-11-24 14:27:35.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:27:35 compute-0 nova_compute[187118]: 2025-11-24 14:27:35.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:27:35 compute-0 nova_compute[187118]: 2025-11-24 14:27:35.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:27:41 compute-0 podman[212688]: 2025-11-24 14:27:41.476535229 +0000 UTC m=+0.074762771 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350)
Nov 24 14:27:43 compute-0 podman[212710]: 2025-11-24 14:27:43.520019589 +0000 UTC m=+0.113295147 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 14:27:46 compute-0 podman[212737]: 2025-11-24 14:27:46.455403573 +0000 UTC m=+0.054453104 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:27:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:27:56.653 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:27:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:27:56.654 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:27:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:27:56.654 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:27:58 compute-0 podman[212761]: 2025-11-24 14:27:58.436736174 +0000 UTC m=+0.047156744 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:28:03 compute-0 podman[212784]: 2025-11-24 14:28:03.428014313 +0000 UTC m=+0.043890781 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 24 14:28:06 compute-0 podman[212804]: 2025-11-24 14:28:06.468486351 +0000 UTC m=+0.074708918 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 14:28:06 compute-0 podman[212805]: 2025-11-24 14:28:06.478808775 +0000 UTC m=+0.081674957 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 14:28:12 compute-0 podman[212840]: 2025-11-24 14:28:12.440549237 +0000 UTC m=+0.048280886 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 24 14:28:14 compute-0 podman[212861]: 2025-11-24 14:28:14.46942364 +0000 UTC m=+0.084011113 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 14:28:17 compute-0 podman[212889]: 2025-11-24 14:28:17.451269849 +0000 UTC m=+0.066843295 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:28:29 compute-0 podman[212913]: 2025-11-24 14:28:29.443581496 +0000 UTC m=+0.053195456 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:28:32 compute-0 nova_compute[187118]: 2025-11-24 14:28:32.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:33 compute-0 nova_compute[187118]: 2025-11-24 14:28:33.791 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:33 compute-0 nova_compute[187118]: 2025-11-24 14:28:33.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:33 compute-0 nova_compute[187118]: 2025-11-24 14:28:33.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:28:33 compute-0 nova_compute[187118]: 2025-11-24 14:28:33.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:28:33 compute-0 nova_compute[187118]: 2025-11-24 14:28:33.807 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:28:34 compute-0 podman[212938]: 2025-11-24 14:28:34.437487518 +0000 UTC m=+0.043654114 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:28:34 compute-0 nova_compute[187118]: 2025-11-24 14:28:34.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:34 compute-0 nova_compute[187118]: 2025-11-24 14:28:34.811 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.824 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.824 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.824 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.825 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.963 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.964 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6041MB free_disk=73.4970932006836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.964 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:28:35 compute-0 nova_compute[187118]: 2025-11-24 14:28:35.965 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:28:36 compute-0 nova_compute[187118]: 2025-11-24 14:28:36.018 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:28:36 compute-0 nova_compute[187118]: 2025-11-24 14:28:36.019 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:28:36 compute-0 nova_compute[187118]: 2025-11-24 14:28:36.039 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:28:36 compute-0 nova_compute[187118]: 2025-11-24 14:28:36.051 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:28:36 compute-0 nova_compute[187118]: 2025-11-24 14:28:36.052 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:28:36 compute-0 nova_compute[187118]: 2025-11-24 14:28:36.052 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:28:37 compute-0 nova_compute[187118]: 2025-11-24 14:28:37.052 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:37 compute-0 podman[212959]: 2025-11-24 14:28:37.471238846 +0000 UTC m=+0.071812605 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:28:37 compute-0 podman[212958]: 2025-11-24 14:28:37.513915671 +0000 UTC m=+0.110782265 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 14:28:37 compute-0 nova_compute[187118]: 2025-11-24 14:28:37.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:28:43 compute-0 podman[212998]: 2025-11-24 14:28:43.461487501 +0000 UTC m=+0.071204259 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal)
Nov 24 14:28:45 compute-0 podman[213019]: 2025-11-24 14:28:45.516998432 +0000 UTC m=+0.120980286 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 14:28:48 compute-0 podman[213046]: 2025-11-24 14:28:48.479777648 +0000 UTC m=+0.075170001 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:28:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:28:56.654 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:28:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:28:56.655 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:28:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:28:56.655 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:00 compute-0 podman[213070]: 2025-11-24 14:29:00.455227822 +0000 UTC m=+0.060784868 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:29:05 compute-0 podman[213094]: 2025-11-24 14:29:05.492133008 +0000 UTC m=+0.085119507 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 14:29:08 compute-0 podman[213115]: 2025-11-24 14:29:08.438318777 +0000 UTC m=+0.048850825 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 24 14:29:08 compute-0 podman[213114]: 2025-11-24 14:29:08.463947242 +0000 UTC m=+0.077026359 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:29:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:14.329 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:29:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:14.331 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:29:14 compute-0 podman[213153]: 2025-11-24 14:29:14.451861247 +0000 UTC m=+0.065870676 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7)
Nov 24 14:29:16 compute-0 podman[213174]: 2025-11-24 14:29:16.51327367 +0000 UTC m=+0.123046805 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:29:17 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:17.333 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:29:19 compute-0 podman[213202]: 2025-11-24 14:29:19.427485923 +0000 UTC m=+0.042039450 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:29:31 compute-0 podman[213226]: 2025-11-24 14:29:31.521066349 +0000 UTC m=+0.117652189 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:29:31 compute-0 nova_compute[187118]: 2025-11-24 14:29:31.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:31 compute-0 nova_compute[187118]: 2025-11-24 14:29:31.798 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 14:29:31 compute-0 nova_compute[187118]: 2025-11-24 14:29:31.822 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 14:29:31 compute-0 nova_compute[187118]: 2025-11-24 14:29:31.823 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:31 compute-0 nova_compute[187118]: 2025-11-24 14:29:31.823 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 14:29:31 compute-0 nova_compute[187118]: 2025-11-24 14:29:31.839 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:32 compute-0 nova_compute[187118]: 2025-11-24 14:29:32.850 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:33 compute-0 nova_compute[187118]: 2025-11-24 14:29:33.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:33 compute-0 nova_compute[187118]: 2025-11-24 14:29:33.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:29:33 compute-0 nova_compute[187118]: 2025-11-24 14:29:33.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:29:33 compute-0 nova_compute[187118]: 2025-11-24 14:29:33.810 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.135 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "bb69573d-afb8-4ab1-833e-04ae871dcad7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.136 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.152 187122 DEBUG nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.311 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.312 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.321 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.321 187122 INFO nova.compute.claims [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.462 187122 DEBUG nova.scheduler.client.report [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Refreshing inventories for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.508 187122 DEBUG nova.scheduler.client.report [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Updating ProviderTree inventory for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.510 187122 DEBUG nova.compute.provider_tree [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Updating inventory in ProviderTree for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.528 187122 DEBUG nova.scheduler.client.report [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Refreshing aggregate associations for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.552 187122 DEBUG nova.scheduler.client.report [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Refreshing trait associations for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b, traits: COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AESNI,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.597 187122 DEBUG nova.compute.provider_tree [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.611 187122 DEBUG nova.scheduler.client.report [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.645 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.646 187122 DEBUG nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.693 187122 DEBUG nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.693 187122 DEBUG nova.network.neutron [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.711 187122 INFO nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.726 187122 DEBUG nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.798 187122 DEBUG nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.800 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.800 187122 INFO nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Creating image(s)
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.801 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.801 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.802 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.803 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:34 compute-0 nova_compute[187118]: 2025-11-24 14:29:34.803 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.126 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:29:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.238 187122 WARNING oslo_policy.policy [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.239 187122 WARNING oslo_policy.policy [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.240 187122 DEBUG nova.policy [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.827 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.827 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.827 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:35 compute-0 nova_compute[187118]: 2025-11-24 14:29:35.827 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.171 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.173 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6071MB free_disk=73.48142623901367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.173 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.174 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.204 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.255 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance bb69573d-afb8-4ab1-833e-04ae871dcad7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.256 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.256 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.274 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca.part --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.276 187122 DEBUG nova.virt.images [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] 54a328f6-92ea-410e-beaf-ba04bab9ef9a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.277 187122 DEBUG nova.privsep.utils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.278 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca.part /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.338 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updating inventory in ProviderTree for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.404 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updated inventory for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.405 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updating resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.405 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updating inventory in ProviderTree for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.414 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca.part /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca.converted" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.420 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.437 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.438 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.264s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:36 compute-0 podman[213263]: 2025-11-24 14:29:36.463197987 +0000 UTC m=+0.055104613 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.473 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca.converted --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.474 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.490 187122 INFO oslo.privsep.daemon [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpeojrllvr/privsep.sock']
Nov 24 14:29:36 compute-0 nova_compute[187118]: 2025-11-24 14:29:36.587 187122 DEBUG nova.network.neutron [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Successfully created port: aba4af13-ceac-4d72-af85-e39af5aec20c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.158 187122 INFO oslo.privsep.daemon [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Spawned new privsep daemon via rootwrap
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.044 213288 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.047 213288 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.049 213288 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.049 213288 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213288
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.253 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.303 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.305 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.305 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.316 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.366 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.367 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.398 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.400 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.400 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.439 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.440 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.440 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.454 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.455 187122 DEBUG nova.virt.disk.api [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.456 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.510 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.511 187122 DEBUG nova.virt.disk.api [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.511 187122 DEBUG nova.objects.instance [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid bb69573d-afb8-4ab1-833e-04ae871dcad7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.527 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.527 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Ensure instance console log exists: /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.527 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.528 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:37 compute-0 nova_compute[187118]: 2025-11-24 14:29:37.528 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:38 compute-0 nova_compute[187118]: 2025-11-24 14:29:38.150 187122 DEBUG nova.network.neutron [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Successfully updated port: aba4af13-ceac-4d72-af85-e39af5aec20c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:29:38 compute-0 nova_compute[187118]: 2025-11-24 14:29:38.165 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:29:38 compute-0 nova_compute[187118]: 2025-11-24 14:29:38.165 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:29:38 compute-0 nova_compute[187118]: 2025-11-24 14:29:38.166 187122 DEBUG nova.network.neutron [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:29:38 compute-0 nova_compute[187118]: 2025-11-24 14:29:38.307 187122 DEBUG nova.network.neutron [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:29:38 compute-0 nova_compute[187118]: 2025-11-24 14:29:38.623 187122 DEBUG nova.compute.manager [req-762c5418-b131-4bd1-917f-1926bc8bbad3 req-15eaeae0-a6f7-464d-9f17-042373ded155 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received event network-changed-aba4af13-ceac-4d72-af85-e39af5aec20c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:29:38 compute-0 nova_compute[187118]: 2025-11-24 14:29:38.624 187122 DEBUG nova.compute.manager [req-762c5418-b131-4bd1-917f-1926bc8bbad3 req-15eaeae0-a6f7-464d-9f17-042373ded155 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Refreshing instance network info cache due to event network-changed-aba4af13-ceac-4d72-af85-e39af5aec20c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:29:38 compute-0 nova_compute[187118]: 2025-11-24 14:29:38.624 187122 DEBUG oslo_concurrency.lockutils [req-762c5418-b131-4bd1-917f-1926bc8bbad3 req-15eaeae0-a6f7-464d-9f17-042373ded155 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:29:39 compute-0 podman[213306]: 2025-11-24 14:29:39.436316867 +0000 UTC m=+0.047777236 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 24 14:29:39 compute-0 podman[213305]: 2025-11-24 14:29:39.439552814 +0000 UTC m=+0.053045608 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.466 187122 DEBUG nova.network.neutron [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updating instance_info_cache with network_info: [{"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.486 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.487 187122 DEBUG nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Instance network_info: |[{"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.487 187122 DEBUG oslo_concurrency.lockutils [req-762c5418-b131-4bd1-917f-1926bc8bbad3 req-15eaeae0-a6f7-464d-9f17-042373ded155 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.487 187122 DEBUG nova.network.neutron [req-762c5418-b131-4bd1-917f-1926bc8bbad3 req-15eaeae0-a6f7-464d-9f17-042373ded155 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Refreshing network info cache for port aba4af13-ceac-4d72-af85-e39af5aec20c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.493 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Start _get_guest_xml network_info=[{"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.501 187122 WARNING nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.510 187122 DEBUG nova.virt.libvirt.host [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.510 187122 DEBUG nova.virt.libvirt.host [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.516 187122 DEBUG nova.virt.libvirt.host [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.517 187122 DEBUG nova.virt.libvirt.host [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.517 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.517 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.518 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.518 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.518 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.518 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.518 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.519 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.519 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.519 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.519 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.519 187122 DEBUG nova.virt.hardware [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.523 187122 DEBUG nova.privsep.utils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.523 187122 DEBUG nova.virt.libvirt.vif [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:29:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1032848293',display_name='tempest-TestNetworkBasicOps-server-1032848293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1032848293',id=1,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAg+4WDbs9dvf+ZfMdzWe+3frfT2iNST3PznRg9a5JPQ5k9XOSE1hSYOU22Jk4o7BXQkELAaKL7sRamVWrjqRPyn6FTi2nPDS7facyBL0RDXEHXBjDvsTuuOIpe2dedRLA==',key_name='tempest-TestNetworkBasicOps-2128408494',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-t0qg6gl6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:29:34Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bb69573d-afb8-4ab1-833e-04ae871dcad7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.524 187122 DEBUG nova.network.os_vif_util [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.524 187122 DEBUG nova.network.os_vif_util [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:a2:ef,bridge_name='br-int',has_traffic_filtering=True,id=aba4af13-ceac-4d72-af85-e39af5aec20c,network=Network(ef00357c-8383-4ce4-bb83-80ee7be7b5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaba4af13-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.526 187122 DEBUG nova.objects.instance [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid bb69573d-afb8-4ab1-833e-04ae871dcad7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.538 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <uuid>bb69573d-afb8-4ab1-833e-04ae871dcad7</uuid>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <name>instance-00000001</name>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-1032848293</nova:name>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:29:39</nova:creationTime>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:29:39 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:29:39 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:29:39 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:29:39 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:29:39 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:29:39 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:29:39 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:29:39 compute-0 nova_compute[187118]:         <nova:port uuid="aba4af13-ceac-4d72-af85-e39af5aec20c">
Nov 24 14:29:39 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <system>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <entry name="serial">bb69573d-afb8-4ab1-833e-04ae871dcad7</entry>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <entry name="uuid">bb69573d-afb8-4ab1-833e-04ae871dcad7</entry>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     </system>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <os>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   </os>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <features>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   </features>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk.config"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:25:a2:ef"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <target dev="tapaba4af13-ce"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/console.log" append="off"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <video>
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     </video>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:29:39 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:29:39 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:29:39 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:29:39 compute-0 nova_compute[187118]: </domain>
Nov 24 14:29:39 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.538 187122 DEBUG nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Preparing to wait for external event network-vif-plugged-aba4af13-ceac-4d72-af85-e39af5aec20c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.538 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.539 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.539 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.539 187122 DEBUG nova.virt.libvirt.vif [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:29:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1032848293',display_name='tempest-TestNetworkBasicOps-server-1032848293',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1032848293',id=1,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAg+4WDbs9dvf+ZfMdzWe+3frfT2iNST3PznRg9a5JPQ5k9XOSE1hSYOU22Jk4o7BXQkELAaKL7sRamVWrjqRPyn6FTi2nPDS7facyBL0RDXEHXBjDvsTuuOIpe2dedRLA==',key_name='tempest-TestNetworkBasicOps-2128408494',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-t0qg6gl6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:29:34Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bb69573d-afb8-4ab1-833e-04ae871dcad7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.539 187122 DEBUG nova.network.os_vif_util [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.540 187122 DEBUG nova.network.os_vif_util [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:25:a2:ef,bridge_name='br-int',has_traffic_filtering=True,id=aba4af13-ceac-4d72-af85-e39af5aec20c,network=Network(ef00357c-8383-4ce4-bb83-80ee7be7b5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaba4af13-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.540 187122 DEBUG os_vif [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:a2:ef,bridge_name='br-int',has_traffic_filtering=True,id=aba4af13-ceac-4d72-af85-e39af5aec20c,network=Network(ef00357c-8383-4ce4-bb83-80ee7be7b5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaba4af13-ce') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.570 187122 DEBUG ovsdbapp.backend.ovs_idl [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.570 187122 DEBUG ovsdbapp.backend.ovs_idl [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.570 187122 DEBUG ovsdbapp.backend.ovs_idl [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.571 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.571 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.572 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.572 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.573 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.575 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.582 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.583 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.583 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.584 187122 INFO oslo.privsep.daemon [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpkuer38pr/privsep.sock']
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.700 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:39 compute-0 nova_compute[187118]: 2025-11-24 14:29:39.798 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.217 187122 INFO oslo.privsep.daemon [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Spawned new privsep daemon via rootwrap
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.105 213351 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.112 213351 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.117 213351 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.117 213351 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213351
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.512 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.512 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaba4af13-ce, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.513 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaba4af13-ce, col_values=(('external_ids', {'iface-id': 'aba4af13-ceac-4d72-af85-e39af5aec20c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:25:a2:ef', 'vm-uuid': 'bb69573d-afb8-4ab1-833e-04ae871dcad7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.515 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:40 compute-0 NetworkManager[55697]: <info>  [1763994580.5169] manager: (tapaba4af13-ce): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.517 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.524 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.525 187122 INFO os_vif [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:25:a2:ef,bridge_name='br-int',has_traffic_filtering=True,id=aba4af13-ceac-4d72-af85-e39af5aec20c,network=Network(ef00357c-8383-4ce4-bb83-80ee7be7b5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaba4af13-ce')
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.579 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.579 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.579 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:25:a2:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:29:40 compute-0 nova_compute[187118]: 2025-11-24 14:29:40.580 187122 INFO nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Using config drive
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.562 187122 INFO nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Creating config drive at /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk.config
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.567 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcrdoijqx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.639 187122 DEBUG nova.network.neutron [req-762c5418-b131-4bd1-917f-1926bc8bbad3 req-15eaeae0-a6f7-464d-9f17-042373ded155 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updated VIF entry in instance network info cache for port aba4af13-ceac-4d72-af85-e39af5aec20c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.640 187122 DEBUG nova.network.neutron [req-762c5418-b131-4bd1-917f-1926bc8bbad3 req-15eaeae0-a6f7-464d-9f17-042373ded155 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updating instance_info_cache with network_info: [{"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.651 187122 DEBUG oslo_concurrency.lockutils [req-762c5418-b131-4bd1-917f-1926bc8bbad3 req-15eaeae0-a6f7-464d-9f17-042373ded155 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.688 187122 DEBUG oslo_concurrency.processutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcrdoijqx" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:29:41 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 24 14:29:41 compute-0 kernel: tapaba4af13-ce: entered promiscuous mode
Nov 24 14:29:41 compute-0 NetworkManager[55697]: <info>  [1763994581.7587] manager: (tapaba4af13-ce): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Nov 24 14:29:41 compute-0 ovn_controller[95613]: 2025-11-24T14:29:41Z|00027|binding|INFO|Claiming lport aba4af13-ceac-4d72-af85-e39af5aec20c for this chassis.
Nov 24 14:29:41 compute-0 ovn_controller[95613]: 2025-11-24T14:29:41Z|00028|binding|INFO|aba4af13-ceac-4d72-af85-e39af5aec20c: Claiming fa:16:3e:25:a2:ef 10.100.0.14
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.760 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.763 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:41.775 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:a2:ef 10.100.0.14'], port_security=['fa:16:3e:25:a2:ef 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'bb69573d-afb8-4ab1-833e-04ae871dcad7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef00357c-8383-4ce4-bb83-80ee7be7b5b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5f5ab2be-3ace-4e21-972d-0f2a6aba47d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe482998-02d7-4d4f-bc96-de688bc3ae29, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=aba4af13-ceac-4d72-af85-e39af5aec20c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:29:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:41.776 104469 INFO neutron.agent.ovn.metadata.agent [-] Port aba4af13-ceac-4d72-af85-e39af5aec20c in datapath ef00357c-8383-4ce4-bb83-80ee7be7b5b1 bound to our chassis
Nov 24 14:29:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:41.778 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef00357c-8383-4ce4-bb83-80ee7be7b5b1
Nov 24 14:29:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:41.779 104469 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpome7zcxh/privsep.sock']
Nov 24 14:29:41 compute-0 systemd-udevd[213378]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:29:41 compute-0 NetworkManager[55697]: <info>  [1763994581.8158] device (tapaba4af13-ce): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:29:41 compute-0 NetworkManager[55697]: <info>  [1763994581.8164] device (tapaba4af13-ce): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:29:41 compute-0 systemd-machined[153483]: New machine qemu-1-instance-00000001.
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.831 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:41 compute-0 ovn_controller[95613]: 2025-11-24T14:29:41Z|00029|binding|INFO|Setting lport aba4af13-ceac-4d72-af85-e39af5aec20c ovn-installed in OVS
Nov 24 14:29:41 compute-0 ovn_controller[95613]: 2025-11-24T14:29:41Z|00030|binding|INFO|Setting lport aba4af13-ceac-4d72-af85-e39af5aec20c up in Southbound
Nov 24 14:29:41 compute-0 nova_compute[187118]: 2025-11-24 14:29:41.836 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:41 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.457 104469 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.458 104469 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpome7zcxh/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.335 213394 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.339 213394 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.341 213394 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.342 213394 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213394
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.460 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[770c474b-73dc-4e3b-9807-37c5157e47a6]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.479 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994582.4790356, bb69573d-afb8-4ab1-833e-04ae871dcad7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.480 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] VM Started (Lifecycle Event)
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.554 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.558 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994582.4799197, bb69573d-afb8-4ab1-833e-04ae871dcad7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.558 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] VM Paused (Lifecycle Event)
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.581 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.585 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.626 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.707 187122 DEBUG nova.compute.manager [req-efd5136f-2fe1-4ed2-a7c1-358adb4bc35a req-1f846c94-2cd2-48fc-ad24-b27884537bce 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received event network-vif-plugged-aba4af13-ceac-4d72-af85-e39af5aec20c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.708 187122 DEBUG oslo_concurrency.lockutils [req-efd5136f-2fe1-4ed2-a7c1-358adb4bc35a req-1f846c94-2cd2-48fc-ad24-b27884537bce 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.708 187122 DEBUG oslo_concurrency.lockutils [req-efd5136f-2fe1-4ed2-a7c1-358adb4bc35a req-1f846c94-2cd2-48fc-ad24-b27884537bce 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.708 187122 DEBUG oslo_concurrency.lockutils [req-efd5136f-2fe1-4ed2-a7c1-358adb4bc35a req-1f846c94-2cd2-48fc-ad24-b27884537bce 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.709 187122 DEBUG nova.compute.manager [req-efd5136f-2fe1-4ed2-a7c1-358adb4bc35a req-1f846c94-2cd2-48fc-ad24-b27884537bce 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Processing event network-vif-plugged-aba4af13-ceac-4d72-af85-e39af5aec20c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.709 187122 DEBUG nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.713 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.715 187122 INFO nova.virt.libvirt.driver [-] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Instance spawned successfully.
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.715 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.723 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994582.7230415, bb69573d-afb8-4ab1-833e-04ae871dcad7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.723 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] VM Resumed (Lifecycle Event)
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.730 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.731 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.731 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.732 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.732 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.733 187122 DEBUG nova.virt.libvirt.driver [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.738 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.740 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.773 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.804 187122 INFO nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Took 8.00 seconds to spawn the instance on the hypervisor.
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.804 187122 DEBUG nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.864 187122 INFO nova.compute.manager [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Took 8.64 seconds to build instance.
Nov 24 14:29:42 compute-0 nova_compute[187118]: 2025-11-24 14:29:42.883 187122 DEBUG oslo_concurrency.lockutils [None req-f96a6adb-dd97-4fbb-bab9-20256f4dba62 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.932 213394 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.932 213394 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:42.932 213394 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:43.484 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee8cc34-0cb3-47ff-a115-34846d329a93]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:43.486 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef00357c-81 in ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:29:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:43.488 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef00357c-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:29:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:43.488 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[5b22bfe3-541f-48f6-a10b-403a5dd76a79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:43.492 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[73a4a07b-5dd3-4092-9a7a-4965fdae6e6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:43.515 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[34e2739a-c90a-4c1b-8c8c-e7360d803b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:43.544 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[d449290c-1b26-453a-b5ff-948c047cac32]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:43.547 104469 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpzksq2h1p/privsep.sock']
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.251 104469 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.252 104469 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpzksq2h1p/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.139 213415 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.143 213415 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.145 213415 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.146 213415 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213415
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.254 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ec125c-c037-4ba2-83ce-99ae3236fd61]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:44 compute-0 nova_compute[187118]: 2025-11-24 14:29:44.702 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.723 213415 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.724 213415 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:44 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:44.724 213415 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:44 compute-0 nova_compute[187118]: 2025-11-24 14:29:44.781 187122 DEBUG nova.compute.manager [req-590b5fa7-5299-4555-b058-2ba14de085d1 req-42a9cce1-69b3-4fbc-bf17-31f8f3ca94de 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received event network-vif-plugged-aba4af13-ceac-4d72-af85-e39af5aec20c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:29:44 compute-0 nova_compute[187118]: 2025-11-24 14:29:44.782 187122 DEBUG oslo_concurrency.lockutils [req-590b5fa7-5299-4555-b058-2ba14de085d1 req-42a9cce1-69b3-4fbc-bf17-31f8f3ca94de 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:44 compute-0 nova_compute[187118]: 2025-11-24 14:29:44.783 187122 DEBUG oslo_concurrency.lockutils [req-590b5fa7-5299-4555-b058-2ba14de085d1 req-42a9cce1-69b3-4fbc-bf17-31f8f3ca94de 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:44 compute-0 nova_compute[187118]: 2025-11-24 14:29:44.783 187122 DEBUG oslo_concurrency.lockutils [req-590b5fa7-5299-4555-b058-2ba14de085d1 req-42a9cce1-69b3-4fbc-bf17-31f8f3ca94de 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:44 compute-0 nova_compute[187118]: 2025-11-24 14:29:44.784 187122 DEBUG nova.compute.manager [req-590b5fa7-5299-4555-b058-2ba14de085d1 req-42a9cce1-69b3-4fbc-bf17-31f8f3ca94de 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] No waiting events found dispatching network-vif-plugged-aba4af13-ceac-4d72-af85-e39af5aec20c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:29:44 compute-0 nova_compute[187118]: 2025-11-24 14:29:44.785 187122 WARNING nova.compute.manager [req-590b5fa7-5299-4555-b058-2ba14de085d1 req-42a9cce1-69b3-4fbc-bf17-31f8f3ca94de 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received unexpected event network-vif-plugged-aba4af13-ceac-4d72-af85-e39af5aec20c for instance with vm_state active and task_state None.
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.296 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[ee7840f9-2b16-4695-a347-a7a3a9f0fa28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.312 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3216dc93-fd9c-4799-a5d6-95e7fb8c1d49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.3154] manager: (tapef00357c-80): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.343 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb8b2ef-c32a-498d-83cc-1f9ce6861288]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.346 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[19a6fdef-dafd-4c75-a434-4d447b759d93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 systemd-udevd[213435]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.3713] device (tapef00357c-80): carrier: link connected
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.381 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[6475d071-65b7-49f8-ac3e-ad4635dcdbb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.413 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[170089dc-7c5c-4b71-8fc9-24b5187a4aa1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef00357c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c5:8d:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 285280, 'reachable_time': 17101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213447, 'error': None, 'target': 'ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.435 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[9e38a427-6dec-4f46-b53f-a329dae38dd3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec5:8d85'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 285280, 'tstamp': 285280}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213464, 'error': None, 'target': 'ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 podman[213424]: 2025-11-24 14:29:45.440621048 +0000 UTC m=+0.098532641 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal)
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.454 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[bf18d883-89ae-4023-aab1-fd46fa942b77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef00357c-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c5:8d:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 285280, 'reachable_time': 17101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213467, 'error': None, 'target': 'ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.480 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[6e02eba9-63df-4ae6-bf83-bcefdde43ac6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 nova_compute[187118]: 2025-11-24 14:29:45.516 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.530 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d88d08-a2af-4ff9-ae30-3d74939a4643]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.532 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef00357c-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.532 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.533 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef00357c-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:29:45 compute-0 kernel: tapef00357c-80: entered promiscuous mode
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.5359] manager: (tapef00357c-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Nov 24 14:29:45 compute-0 nova_compute[187118]: 2025-11-24 14:29:45.535 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.538 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef00357c-80, col_values=(('external_ids', {'iface-id': '3422fe53-fb68-45d5-b5e6-00b6f3797166'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:29:45 compute-0 nova_compute[187118]: 2025-11-24 14:29:45.539 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:45 compute-0 ovn_controller[95613]: 2025-11-24T14:29:45Z|00031|binding|INFO|Releasing lport 3422fe53-fb68-45d5-b5e6-00b6f3797166 from this chassis (sb_readonly=0)
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.544 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef00357c-8383-4ce4-bb83-80ee7be7b5b1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef00357c-8383-4ce4-bb83-80ee7be7b5b1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.545 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[fa9c2aa4-b0bf-4e6b-93de-907cb6b6ff76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.546 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-ef00357c-8383-4ce4-bb83-80ee7be7b5b1
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/ef00357c-8383-4ce4-bb83-80ee7be7b5b1.pid.haproxy
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID ef00357c-8383-4ce4-bb83-80ee7be7b5b1
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:29:45 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:45.546 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1', 'env', 'PROCESS_TAG=haproxy-ef00357c-8383-4ce4-bb83-80ee7be7b5b1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef00357c-8383-4ce4-bb83-80ee7be7b5b1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:29:45 compute-0 nova_compute[187118]: 2025-11-24 14:29:45.553 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:45 compute-0 ovn_controller[95613]: 2025-11-24T14:29:45Z|00032|binding|INFO|Releasing lport 3422fe53-fb68-45d5-b5e6-00b6f3797166 from this chassis (sb_readonly=0)
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.8400] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Nov 24 14:29:45 compute-0 nova_compute[187118]: 2025-11-24 14:29:45.839 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.8406] device (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.8418] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.8421] device (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.8429] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.8437] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.8442] device (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 24 14:29:45 compute-0 NetworkManager[55697]: <info>  [1763994585.8447] device (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 24 14:29:45 compute-0 ovn_controller[95613]: 2025-11-24T14:29:45Z|00033|binding|INFO|Releasing lport 3422fe53-fb68-45d5-b5e6-00b6f3797166 from this chassis (sb_readonly=0)
Nov 24 14:29:45 compute-0 nova_compute[187118]: 2025-11-24 14:29:45.873 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:45 compute-0 nova_compute[187118]: 2025-11-24 14:29:45.877 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:45 compute-0 podman[213500]: 2025-11-24 14:29:45.915340382 +0000 UTC m=+0.058311220 container create 4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:29:45 compute-0 systemd[1]: Started libpod-conmon-4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15.scope.
Nov 24 14:29:45 compute-0 podman[213500]: 2025-11-24 14:29:45.885253498 +0000 UTC m=+0.028224376 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:29:45 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:29:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d06d07b21e7561b93d250c31a34b7471b9570511e8582a5c05972ba91cbf210/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:29:46 compute-0 podman[213500]: 2025-11-24 14:29:46.006208275 +0000 UTC m=+0.149179143 container init 4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 14:29:46 compute-0 podman[213500]: 2025-11-24 14:29:46.01821001 +0000 UTC m=+0.161180848 container start 4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 14:29:46 compute-0 neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1[213515]: [NOTICE]   (213519) : New worker (213521) forked
Nov 24 14:29:46 compute-0 neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1[213515]: [NOTICE]   (213519) : Loading success.
Nov 24 14:29:46 compute-0 nova_compute[187118]: 2025-11-24 14:29:46.923 187122 DEBUG nova.compute.manager [req-a0732f83-14f0-400e-9db1-7a892317bd09 req-86c11cd5-e5ca-4df8-bc6e-664bc30150eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received event network-changed-aba4af13-ceac-4d72-af85-e39af5aec20c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:29:46 compute-0 nova_compute[187118]: 2025-11-24 14:29:46.923 187122 DEBUG nova.compute.manager [req-a0732f83-14f0-400e-9db1-7a892317bd09 req-86c11cd5-e5ca-4df8-bc6e-664bc30150eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Refreshing instance network info cache due to event network-changed-aba4af13-ceac-4d72-af85-e39af5aec20c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:29:46 compute-0 nova_compute[187118]: 2025-11-24 14:29:46.924 187122 DEBUG oslo_concurrency.lockutils [req-a0732f83-14f0-400e-9db1-7a892317bd09 req-86c11cd5-e5ca-4df8-bc6e-664bc30150eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:29:46 compute-0 nova_compute[187118]: 2025-11-24 14:29:46.924 187122 DEBUG oslo_concurrency.lockutils [req-a0732f83-14f0-400e-9db1-7a892317bd09 req-86c11cd5-e5ca-4df8-bc6e-664bc30150eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:29:46 compute-0 nova_compute[187118]: 2025-11-24 14:29:46.924 187122 DEBUG nova.network.neutron [req-a0732f83-14f0-400e-9db1-7a892317bd09 req-86c11cd5-e5ca-4df8-bc6e-664bc30150eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Refreshing network info cache for port aba4af13-ceac-4d72-af85-e39af5aec20c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:29:47 compute-0 podman[213530]: 2025-11-24 14:29:47.475389758 +0000 UTC m=+0.081574861 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Nov 24 14:29:47 compute-0 nova_compute[187118]: 2025-11-24 14:29:47.935 187122 DEBUG nova.network.neutron [req-a0732f83-14f0-400e-9db1-7a892317bd09 req-86c11cd5-e5ca-4df8-bc6e-664bc30150eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updated VIF entry in instance network info cache for port aba4af13-ceac-4d72-af85-e39af5aec20c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:29:47 compute-0 nova_compute[187118]: 2025-11-24 14:29:47.937 187122 DEBUG nova.network.neutron [req-a0732f83-14f0-400e-9db1-7a892317bd09 req-86c11cd5-e5ca-4df8-bc6e-664bc30150eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updating instance_info_cache with network_info: [{"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:29:47 compute-0 nova_compute[187118]: 2025-11-24 14:29:47.951 187122 DEBUG oslo_concurrency.lockutils [req-a0732f83-14f0-400e-9db1-7a892317bd09 req-86c11cd5-e5ca-4df8-bc6e-664bc30150eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:29:49 compute-0 nova_compute[187118]: 2025-11-24 14:29:49.704 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:50 compute-0 podman[213556]: 2025-11-24 14:29:50.460119132 +0000 UTC m=+0.059856453 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:29:50 compute-0 nova_compute[187118]: 2025-11-24 14:29:50.518 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:54 compute-0 ovn_controller[95613]: 2025-11-24T14:29:54Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:25:a2:ef 10.100.0.14
Nov 24 14:29:54 compute-0 ovn_controller[95613]: 2025-11-24T14:29:54Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:25:a2:ef 10.100.0.14
Nov 24 14:29:54 compute-0 nova_compute[187118]: 2025-11-24 14:29:54.710 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:55 compute-0 nova_compute[187118]: 2025-11-24 14:29:55.521 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:29:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:56.656 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:29:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:56.656 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:29:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:29:56.657 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:29:59 compute-0 nova_compute[187118]: 2025-11-24 14:29:59.713 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:00 compute-0 nova_compute[187118]: 2025-11-24 14:30:00.524 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:00 compute-0 nova_compute[187118]: 2025-11-24 14:30:00.733 187122 INFO nova.compute.manager [None req-80f1b580-95aa-4479-8bbe-8a4522ba9a65 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Get console output
Nov 24 14:30:00 compute-0 nova_compute[187118]: 2025-11-24 14:30:00.822 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:30:02 compute-0 podman[213594]: 2025-11-24 14:30:02.47363707 +0000 UTC m=+0.069822505 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 14:30:04 compute-0 nova_compute[187118]: 2025-11-24 14:30:04.715 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:05 compute-0 nova_compute[187118]: 2025-11-24 14:30:05.526 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:07 compute-0 podman[213619]: 2025-11-24 14:30:07.507300954 +0000 UTC m=+0.105958454 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.392 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "992fc509-2c64-4e90-8c91-9e657e37b9c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.393 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.408 187122 DEBUG nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.716 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.763 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.763 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.772 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.772 187122 INFO nova.compute.claims [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.934 187122 DEBUG nova.compute.provider_tree [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.946 187122 DEBUG nova.scheduler.client.report [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.964 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:09 compute-0 nova_compute[187118]: 2025-11-24 14:30:09.965 187122 DEBUG nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.004 187122 DEBUG nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.005 187122 DEBUG nova.network.neutron [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.017 187122 INFO nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.030 187122 DEBUG nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.120 187122 DEBUG nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.121 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.122 187122 INFO nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Creating image(s)
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.122 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.122 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.123 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.134 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.201 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.202 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.202 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.212 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.227 187122 DEBUG nova.policy [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.263 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.264 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.294 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.295 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.296 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.351 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.353 187122 DEBUG nova.virt.disk.api [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.353 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.409 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.410 187122 DEBUG nova.virt.disk.api [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.410 187122 DEBUG nova.objects.instance [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid 992fc509-2c64-4e90-8c91-9e657e37b9c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.425 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.426 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Ensure instance console log exists: /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.426 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.426 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.427 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:10 compute-0 podman[213652]: 2025-11-24 14:30:10.449619908 +0000 UTC m=+0.058101076 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 14:30:10 compute-0 podman[213653]: 2025-11-24 14:30:10.449629338 +0000 UTC m=+0.053807700 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.529 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:10 compute-0 nova_compute[187118]: 2025-11-24 14:30:10.890 187122 DEBUG nova.network.neutron [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Successfully created port: bdac8846-16d8-4956-86de-7562233b3a16 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:30:14 compute-0 nova_compute[187118]: 2025-11-24 14:30:14.719 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:15 compute-0 nova_compute[187118]: 2025-11-24 14:30:15.207 187122 DEBUG nova.network.neutron [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Successfully updated port: bdac8846-16d8-4956-86de-7562233b3a16 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:30:15 compute-0 nova_compute[187118]: 2025-11-24 14:30:15.223 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-992fc509-2c64-4e90-8c91-9e657e37b9c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:30:15 compute-0 nova_compute[187118]: 2025-11-24 14:30:15.223 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-992fc509-2c64-4e90-8c91-9e657e37b9c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:30:15 compute-0 nova_compute[187118]: 2025-11-24 14:30:15.224 187122 DEBUG nova.network.neutron [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:30:15 compute-0 nova_compute[187118]: 2025-11-24 14:30:15.531 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:15 compute-0 nova_compute[187118]: 2025-11-24 14:30:15.638 187122 DEBUG nova.compute.manager [req-8f2d6434-3c40-4133-a642-36cb3fa98437 req-b0d51fd1-051f-4983-99a7-9a7f6c2b2dec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Received event network-changed-bdac8846-16d8-4956-86de-7562233b3a16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:15 compute-0 nova_compute[187118]: 2025-11-24 14:30:15.638 187122 DEBUG nova.compute.manager [req-8f2d6434-3c40-4133-a642-36cb3fa98437 req-b0d51fd1-051f-4983-99a7-9a7f6c2b2dec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Refreshing instance network info cache due to event network-changed-bdac8846-16d8-4956-86de-7562233b3a16. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:30:15 compute-0 nova_compute[187118]: 2025-11-24 14:30:15.638 187122 DEBUG oslo_concurrency.lockutils [req-8f2d6434-3c40-4133-a642-36cb3fa98437 req-b0d51fd1-051f-4983-99a7-9a7f6c2b2dec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-992fc509-2c64-4e90-8c91-9e657e37b9c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:30:16 compute-0 nova_compute[187118]: 2025-11-24 14:30:16.033 187122 DEBUG nova.network.neutron [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:30:16 compute-0 podman[213694]: 2025-11-24 14:30:16.491145831 +0000 UTC m=+0.095977864 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, version=9.6, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.426 187122 DEBUG nova.network.neutron [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Updating instance_info_cache with network_info: [{"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.440 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-992fc509-2c64-4e90-8c91-9e657e37b9c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.441 187122 DEBUG nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Instance network_info: |[{"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.441 187122 DEBUG oslo_concurrency.lockutils [req-8f2d6434-3c40-4133-a642-36cb3fa98437 req-b0d51fd1-051f-4983-99a7-9a7f6c2b2dec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-992fc509-2c64-4e90-8c91-9e657e37b9c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.441 187122 DEBUG nova.network.neutron [req-8f2d6434-3c40-4133-a642-36cb3fa98437 req-b0d51fd1-051f-4983-99a7-9a7f6c2b2dec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Refreshing network info cache for port bdac8846-16d8-4956-86de-7562233b3a16 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.443 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Start _get_guest_xml network_info=[{"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.448 187122 WARNING nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.452 187122 DEBUG nova.virt.libvirt.host [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.453 187122 DEBUG nova.virt.libvirt.host [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.460 187122 DEBUG nova.virt.libvirt.host [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.461 187122 DEBUG nova.virt.libvirt.host [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.461 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.461 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.462 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.462 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.462 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.462 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.463 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.463 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.463 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.463 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.464 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.464 187122 DEBUG nova.virt.hardware [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.467 187122 DEBUG nova.virt.libvirt.vif [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:30:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1381207947',display_name='tempest-TestNetworkBasicOps-server-1381207947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1381207947',id=2,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKz1zKAjc+yf5li4da8XIvqWDXRuTDpiaslTgmqHXzBGz94VFVJhUaQqgWe3b+EYB3Rh82kd8dRo5mKSkVT1NTEeg5LwcEe3f+FXePfWPOOSqrhT5QwRKjg0U9ej6crFZQ==',key_name='tempest-TestNetworkBasicOps-42396895',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-xxy4lmcs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:30:10Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=992fc509-2c64-4e90-8c91-9e657e37b9c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.468 187122 DEBUG nova.network.os_vif_util [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.468 187122 DEBUG nova.network.os_vif_util [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:2c:3f,bridge_name='br-int',has_traffic_filtering=True,id=bdac8846-16d8-4956-86de-7562233b3a16,network=Network(5001df36-3c56-487d-82eb-e715c1547595),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbdac8846-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.469 187122 DEBUG nova.objects.instance [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid 992fc509-2c64-4e90-8c91-9e657e37b9c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.481 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <uuid>992fc509-2c64-4e90-8c91-9e657e37b9c1</uuid>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <name>instance-00000002</name>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-1381207947</nova:name>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:30:17</nova:creationTime>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:30:17 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:30:17 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:30:17 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:30:17 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:30:17 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:30:17 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:30:17 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:30:17 compute-0 nova_compute[187118]:         <nova:port uuid="bdac8846-16d8-4956-86de-7562233b3a16">
Nov 24 14:30:17 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <system>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <entry name="serial">992fc509-2c64-4e90-8c91-9e657e37b9c1</entry>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <entry name="uuid">992fc509-2c64-4e90-8c91-9e657e37b9c1</entry>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     </system>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <os>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   </os>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <features>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   </features>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk.config"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:47:2c:3f"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <target dev="tapbdac8846-16"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/console.log" append="off"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <video>
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     </video>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:30:17 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:30:17 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:30:17 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:30:17 compute-0 nova_compute[187118]: </domain>
Nov 24 14:30:17 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.482 187122 DEBUG nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Preparing to wait for external event network-vif-plugged-bdac8846-16d8-4956-86de-7562233b3a16 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.482 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.483 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.483 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.485 187122 DEBUG nova.virt.libvirt.vif [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:30:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1381207947',display_name='tempest-TestNetworkBasicOps-server-1381207947',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1381207947',id=2,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKz1zKAjc+yf5li4da8XIvqWDXRuTDpiaslTgmqHXzBGz94VFVJhUaQqgWe3b+EYB3Rh82kd8dRo5mKSkVT1NTEeg5LwcEe3f+FXePfWPOOSqrhT5QwRKjg0U9ej6crFZQ==',key_name='tempest-TestNetworkBasicOps-42396895',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-xxy4lmcs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:30:10Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=992fc509-2c64-4e90-8c91-9e657e37b9c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.486 187122 DEBUG nova.network.os_vif_util [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.487 187122 DEBUG nova.network.os_vif_util [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:2c:3f,bridge_name='br-int',has_traffic_filtering=True,id=bdac8846-16d8-4956-86de-7562233b3a16,network=Network(5001df36-3c56-487d-82eb-e715c1547595),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbdac8846-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.488 187122 DEBUG os_vif [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:2c:3f,bridge_name='br-int',has_traffic_filtering=True,id=bdac8846-16d8-4956-86de-7562233b3a16,network=Network(5001df36-3c56-487d-82eb-e715c1547595),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbdac8846-16') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.488 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.489 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.490 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.494 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.494 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbdac8846-16, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.495 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbdac8846-16, col_values=(('external_ids', {'iface-id': 'bdac8846-16d8-4956-86de-7562233b3a16', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:2c:3f', 'vm-uuid': '992fc509-2c64-4e90-8c91-9e657e37b9c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:17 compute-0 NetworkManager[55697]: <info>  [1763994617.4993] manager: (tapbdac8846-16): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.497 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.500 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.503 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.504 187122 INFO os_vif [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:2c:3f,bridge_name='br-int',has_traffic_filtering=True,id=bdac8846-16d8-4956-86de-7562233b3a16,network=Network(5001df36-3c56-487d-82eb-e715c1547595),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbdac8846-16')
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.554 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.555 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.555 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:47:2c:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:30:17 compute-0 nova_compute[187118]: 2025-11-24 14:30:17.556 187122 INFO nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Using config drive
Nov 24 14:30:17 compute-0 podman[213719]: 2025-11-24 14:30:17.622360845 +0000 UTC m=+0.087949727 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.094 187122 INFO nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Creating config drive at /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk.config
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.104 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkf_lk9xt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.246 187122 DEBUG oslo_concurrency.processutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkf_lk9xt" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:18 compute-0 kernel: tapbdac8846-16: entered promiscuous mode
Nov 24 14:30:18 compute-0 NetworkManager[55697]: <info>  [1763994618.3072] manager: (tapbdac8846-16): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Nov 24 14:30:18 compute-0 ovn_controller[95613]: 2025-11-24T14:30:18Z|00034|binding|INFO|Claiming lport bdac8846-16d8-4956-86de-7562233b3a16 for this chassis.
Nov 24 14:30:18 compute-0 ovn_controller[95613]: 2025-11-24T14:30:18Z|00035|binding|INFO|bdac8846-16d8-4956-86de-7562233b3a16: Claiming fa:16:3e:47:2c:3f 10.100.0.26
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.309 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.320 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:2c:3f 10.100.0.26'], port_security=['fa:16:3e:47:2c:3f 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': '992fc509-2c64-4e90-8c91-9e657e37b9c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5001df36-3c56-487d-82eb-e715c1547595', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '724cd23e-7083-4a93-8f5c-b220fb28f0a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b89d9e7-725d-454e-84a0-64d6572ebc8b, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=bdac8846-16d8-4956-86de-7562233b3a16) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.321 104469 INFO neutron.agent.ovn.metadata.agent [-] Port bdac8846-16d8-4956-86de-7562233b3a16 in datapath 5001df36-3c56-487d-82eb-e715c1547595 bound to our chassis
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.322 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5001df36-3c56-487d-82eb-e715c1547595
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.332 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e9c28234-68ca-4635-ac81-d53003b77b98]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.333 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5001df36-31 in ovnmeta-5001df36-3c56-487d-82eb-e715c1547595 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.335 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5001df36-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.335 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[db695fee-a65f-4c95-ab90-f0ea06dfd5a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.337 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3757f47c-b2d2-4637-a856-0dda49390aed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 systemd-udevd[213764]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.363 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:18 compute-0 ovn_controller[95613]: 2025-11-24T14:30:18Z|00036|binding|INFO|Setting lport bdac8846-16d8-4956-86de-7562233b3a16 ovn-installed in OVS
Nov 24 14:30:18 compute-0 ovn_controller[95613]: 2025-11-24T14:30:18Z|00037|binding|INFO|Setting lport bdac8846-16d8-4956-86de-7562233b3a16 up in Southbound
Nov 24 14:30:18 compute-0 NetworkManager[55697]: <info>  [1763994618.3668] device (tapbdac8846-16): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:30:18 compute-0 NetworkManager[55697]: <info>  [1763994618.3675] device (tapbdac8846-16): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.367 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.368 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[872c4ec6-5f77-4b78-a04b-a5cee002582d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 systemd-machined[153483]: New machine qemu-2-instance-00000002.
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.380 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7900e8e3-740e-4648-b2ab-edafdb8eead7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.408 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa44020-3ccb-45c1-8e31-35003a66039f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.412 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[1e56a4d3-bcc2-4ce6-b504-c0bc8789c543]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 NetworkManager[55697]: <info>  [1763994618.4137] manager: (tap5001df36-30): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.441 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[2367a0ec-4fee-4b35-a2fb-59f77844e518]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.444 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[2393e0a0-f5e7-4bcb-8a29-e8799c98ed0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 NetworkManager[55697]: <info>  [1763994618.4641] device (tap5001df36-30): carrier: link connected
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.467 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[b1985cb9-1a9a-45dd-b378-a18f854f4607]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.479 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[63e377f1-7cf3-4b36-aa4c-4f5e022df91e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5001df36-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:f9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 288589, 'reachable_time': 35649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213797, 'error': None, 'target': 'ovnmeta-5001df36-3c56-487d-82eb-e715c1547595', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.490 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e51a8bee-ea9d-4291-9cf1-c8adc9ca3ad5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5e:f957'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 288589, 'tstamp': 288589}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213798, 'error': None, 'target': 'ovnmeta-5001df36-3c56-487d-82eb-e715c1547595', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.505 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[97a94f05-96a9-4523-aafd-e324c9a74248]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5001df36-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:f9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 288589, 'reachable_time': 35649, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213799, 'error': None, 'target': 'ovnmeta-5001df36-3c56-487d-82eb-e715c1547595', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.525 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[054e1c11-c6e7-4f82-b4b0-288662d0e4ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.541 187122 DEBUG nova.compute.manager [req-6c747d06-f4de-4f8d-8231-b11619aa1424 req-eee74813-77dc-4214-96b0-bd0df63908af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Received event network-vif-plugged-bdac8846-16d8-4956-86de-7562233b3a16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.541 187122 DEBUG oslo_concurrency.lockutils [req-6c747d06-f4de-4f8d-8231-b11619aa1424 req-eee74813-77dc-4214-96b0-bd0df63908af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.542 187122 DEBUG oslo_concurrency.lockutils [req-6c747d06-f4de-4f8d-8231-b11619aa1424 req-eee74813-77dc-4214-96b0-bd0df63908af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.542 187122 DEBUG oslo_concurrency.lockutils [req-6c747d06-f4de-4f8d-8231-b11619aa1424 req-eee74813-77dc-4214-96b0-bd0df63908af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.542 187122 DEBUG nova.compute.manager [req-6c747d06-f4de-4f8d-8231-b11619aa1424 req-eee74813-77dc-4214-96b0-bd0df63908af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Processing event network-vif-plugged-bdac8846-16d8-4956-86de-7562233b3a16 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.569 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[0f345e2e-992e-417b-aab7-672ac2b7e0ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.570 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5001df36-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.570 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.570 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5001df36-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:18 compute-0 kernel: tap5001df36-30: entered promiscuous mode
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.572 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:18 compute-0 NetworkManager[55697]: <info>  [1763994618.5740] manager: (tap5001df36-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.574 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.575 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5001df36-30, col_values=(('external_ids', {'iface-id': 'ce5a3d0d-f7b9-49b4-b6de-b1e0d7b71ece'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.576 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:18 compute-0 ovn_controller[95613]: 2025-11-24T14:30:18Z|00038|binding|INFO|Releasing lport ce5a3d0d-f7b9-49b4-b6de-b1e0d7b71ece from this chassis (sb_readonly=0)
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.577 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.577 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5001df36-3c56-487d-82eb-e715c1547595.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5001df36-3c56-487d-82eb-e715c1547595.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.578 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8e61670f-160d-490a-a7c4-16da9d442b07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.578 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-5001df36-3c56-487d-82eb-e715c1547595
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/5001df36-3c56-487d-82eb-e715c1547595.pid.haproxy
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID 5001df36-3c56-487d-82eb-e715c1547595
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:30:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:18.579 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5001df36-3c56-487d-82eb-e715c1547595', 'env', 'PROCESS_TAG=haproxy-5001df36-3c56-487d-82eb-e715c1547595', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5001df36-3c56-487d-82eb-e715c1547595.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.588 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.936 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994618.9358964, 992fc509-2c64-4e90-8c91-9e657e37b9c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.936 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] VM Started (Lifecycle Event)
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.938 187122 DEBUG nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.942 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.945 187122 INFO nova.virt.libvirt.driver [-] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Instance spawned successfully.
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.945 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:30:18 compute-0 podman[213835]: 2025-11-24 14:30:18.955179425 +0000 UTC m=+0.066375951 container create 3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.965 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:30:18 compute-0 nova_compute[187118]: 2025-11-24 14:30:18.968 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:30:18 compute-0 systemd[1]: Started libpod-conmon-3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0.scope.
Nov 24 14:30:19 compute-0 podman[213835]: 2025-11-24 14:30:18.911584293 +0000 UTC m=+0.022780849 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.010 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.010 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994618.936117, 992fc509-2c64-4e90-8c91-9e657e37b9c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.011 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] VM Paused (Lifecycle Event)
Nov 24 14:30:19 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:30:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78c95f4ed10827dfa3bc5a150f9baa96a9616f6d716e638dd1fe149757dc7b8c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:30:19 compute-0 podman[213835]: 2025-11-24 14:30:19.026230652 +0000 UTC m=+0.137427188 container init 3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.029 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.033 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994618.9412951, 992fc509-2c64-4e90-8c91-9e657e37b9c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.033 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] VM Resumed (Lifecycle Event)
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.035 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.036 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.036 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.036 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.037 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.037 187122 DEBUG nova.virt.libvirt.driver [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:30:19 compute-0 podman[213835]: 2025-11-24 14:30:19.038609117 +0000 UTC m=+0.149805633 container start 3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.054 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.056 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:30:19 compute-0 neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595[213851]: [NOTICE]   (213855) : New worker (213857) forked
Nov 24 14:30:19 compute-0 neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595[213851]: [NOTICE]   (213855) : Loading success.
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.075 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.137 187122 INFO nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Took 9.02 seconds to spawn the instance on the hypervisor.
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.137 187122 DEBUG nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.190 187122 INFO nova.compute.manager [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Took 9.46 seconds to build instance.
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.208 187122 DEBUG oslo_concurrency.lockutils [None req-8fd41581-e726-405a-9eff-4d55c1462142 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:19 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:19.272 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.272 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:19 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:19.273 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.338 187122 DEBUG nova.network.neutron [req-8f2d6434-3c40-4133-a642-36cb3fa98437 req-b0d51fd1-051f-4983-99a7-9a7f6c2b2dec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Updated VIF entry in instance network info cache for port bdac8846-16d8-4956-86de-7562233b3a16. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.338 187122 DEBUG nova.network.neutron [req-8f2d6434-3c40-4133-a642-36cb3fa98437 req-b0d51fd1-051f-4983-99a7-9a7f6c2b2dec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Updating instance_info_cache with network_info: [{"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.353 187122 DEBUG oslo_concurrency.lockutils [req-8f2d6434-3c40-4133-a642-36cb3fa98437 req-b0d51fd1-051f-4983-99a7-9a7f6c2b2dec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-992fc509-2c64-4e90-8c91-9e657e37b9c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:30:19 compute-0 nova_compute[187118]: 2025-11-24 14:30:19.721 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:20 compute-0 nova_compute[187118]: 2025-11-24 14:30:20.602 187122 DEBUG nova.compute.manager [req-8585a445-77ca-4c1c-97c5-555ded81a6ef req-0ae34365-3382-4161-826c-4f58638b1fa0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Received event network-vif-plugged-bdac8846-16d8-4956-86de-7562233b3a16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:20 compute-0 nova_compute[187118]: 2025-11-24 14:30:20.603 187122 DEBUG oslo_concurrency.lockutils [req-8585a445-77ca-4c1c-97c5-555ded81a6ef req-0ae34365-3382-4161-826c-4f58638b1fa0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:20 compute-0 nova_compute[187118]: 2025-11-24 14:30:20.603 187122 DEBUG oslo_concurrency.lockutils [req-8585a445-77ca-4c1c-97c5-555ded81a6ef req-0ae34365-3382-4161-826c-4f58638b1fa0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:20 compute-0 nova_compute[187118]: 2025-11-24 14:30:20.603 187122 DEBUG oslo_concurrency.lockutils [req-8585a445-77ca-4c1c-97c5-555ded81a6ef req-0ae34365-3382-4161-826c-4f58638b1fa0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:20 compute-0 nova_compute[187118]: 2025-11-24 14:30:20.603 187122 DEBUG nova.compute.manager [req-8585a445-77ca-4c1c-97c5-555ded81a6ef req-0ae34365-3382-4161-826c-4f58638b1fa0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] No waiting events found dispatching network-vif-plugged-bdac8846-16d8-4956-86de-7562233b3a16 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:30:20 compute-0 nova_compute[187118]: 2025-11-24 14:30:20.603 187122 WARNING nova.compute.manager [req-8585a445-77ca-4c1c-97c5-555ded81a6ef req-0ae34365-3382-4161-826c-4f58638b1fa0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Received unexpected event network-vif-plugged-bdac8846-16d8-4956-86de-7562233b3a16 for instance with vm_state active and task_state None.
Nov 24 14:30:21 compute-0 podman[213868]: 2025-11-24 14:30:21.475429184 +0000 UTC m=+0.075278032 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:30:22 compute-0 nova_compute[187118]: 2025-11-24 14:30:22.498 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:24 compute-0 nova_compute[187118]: 2025-11-24 14:30:24.724 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:27 compute-0 nova_compute[187118]: 2025-11-24 14:30:27.500 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:28.275 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:29 compute-0 nova_compute[187118]: 2025-11-24 14:30:29.726 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:31 compute-0 ovn_controller[95613]: 2025-11-24T14:30:31Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:47:2c:3f 10.100.0.26
Nov 24 14:30:31 compute-0 ovn_controller[95613]: 2025-11-24T14:30:31Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:47:2c:3f 10.100.0.26
Nov 24 14:30:32 compute-0 nova_compute[187118]: 2025-11-24 14:30:32.506 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:32 compute-0 nova_compute[187118]: 2025-11-24 14:30:32.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:33 compute-0 podman[213901]: 2025-11-24 14:30:33.460800273 +0000 UTC m=+0.073416023 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:30:34 compute-0 nova_compute[187118]: 2025-11-24 14:30:34.728 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:34 compute-0 nova_compute[187118]: 2025-11-24 14:30:34.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:34 compute-0 nova_compute[187118]: 2025-11-24 14:30:34.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:30:34 compute-0 nova_compute[187118]: 2025-11-24 14:30:34.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:30:35 compute-0 nova_compute[187118]: 2025-11-24 14:30:35.198 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:30:35 compute-0 nova_compute[187118]: 2025-11-24 14:30:35.198 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquired lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:30:35 compute-0 nova_compute[187118]: 2025-11-24 14:30:35.199 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 14:30:35 compute-0 nova_compute[187118]: 2025-11-24 14:30:35.199 187122 DEBUG nova.objects.instance [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bb69573d-afb8-4ab1-833e-04ae871dcad7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:30:36 compute-0 nova_compute[187118]: 2025-11-24 14:30:36.262 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updating instance_info_cache with network_info: [{"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:30:36 compute-0 nova_compute[187118]: 2025-11-24 14:30:36.280 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Releasing lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:30:36 compute-0 nova_compute[187118]: 2025-11-24 14:30:36.280 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 14:30:36 compute-0 nova_compute[187118]: 2025-11-24 14:30:36.280 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:36 compute-0 nova_compute[187118]: 2025-11-24 14:30:36.281 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.274 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.508 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.819 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.820 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.820 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.820 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:30:37 compute-0 nova_compute[187118]: 2025-11-24 14:30:37.907 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:37 compute-0 podman[213928]: 2025-11-24 14:30:37.970057734 +0000 UTC m=+0.086109226 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.005 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.007 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.071 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.078 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.131 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.132 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.215 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.395 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.396 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5462MB free_disk=73.40531921386719GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.396 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.396 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.478 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance bb69573d-afb8-4ab1-833e-04ae871dcad7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.479 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance 992fc509-2c64-4e90-8c91-9e657e37b9c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.479 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.479 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.528 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.545 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.569 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:30:38 compute-0 nova_compute[187118]: 2025-11-24 14:30:38.569 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:39 compute-0 nova_compute[187118]: 2025-11-24 14:30:39.730 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:40 compute-0 nova_compute[187118]: 2025-11-24 14:30:40.567 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:30:41 compute-0 podman[213959]: 2025-11-24 14:30:41.456300387 +0000 UTC m=+0.060503812 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 14:30:41 compute-0 podman[213960]: 2025-11-24 14:30:41.473886714 +0000 UTC m=+0.070315108 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes 
Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.391 187122 DEBUG oslo_concurrency.lockutils [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "992fc509-2c64-4e90-8c91-9e657e37b9c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.392 187122 DEBUG oslo_concurrency.lockutils [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.392 187122 DEBUG oslo_concurrency.lockutils [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.393 187122 DEBUG oslo_concurrency.lockutils [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.393 187122 DEBUG oslo_concurrency.lockutils [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.395 187122 INFO nova.compute.manager [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Terminating instance
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.396 187122 DEBUG nova.compute.manager [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:30:42 compute-0 kernel: tapbdac8846-16 (unregistering): left promiscuous mode
Nov 24 14:30:42 compute-0 NetworkManager[55697]: <info>  [1763994642.4243] device (tapbdac8846-16): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:30:42 compute-0 ovn_controller[95613]: 2025-11-24T14:30:42Z|00039|binding|INFO|Releasing lport bdac8846-16d8-4956-86de-7562233b3a16 from this chassis (sb_readonly=0)
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.434 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:42 compute-0 ovn_controller[95613]: 2025-11-24T14:30:42Z|00040|binding|INFO|Setting lport bdac8846-16d8-4956-86de-7562233b3a16 down in Southbound
Nov 24 14:30:42 compute-0 ovn_controller[95613]: 2025-11-24T14:30:42Z|00041|binding|INFO|Removing iface tapbdac8846-16 ovn-installed in OVS
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.438 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.446 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:2c:3f 10.100.0.26'], port_security=['fa:16:3e:47:2c:3f 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': '992fc509-2c64-4e90-8c91-9e657e37b9c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5001df36-3c56-487d-82eb-e715c1547595', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': '724cd23e-7083-4a93-8f5c-b220fb28f0a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b89d9e7-725d-454e-84a0-64d6572ebc8b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=bdac8846-16d8-4956-86de-7562233b3a16) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.449 104469 INFO neutron.agent.ovn.metadata.agent [-] Port bdac8846-16d8-4956-86de-7562233b3a16 in datapath 5001df36-3c56-487d-82eb-e715c1547595 unbound from our chassis
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.451 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5001df36-3c56-487d-82eb-e715c1547595, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.453 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.453 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[9e4ecef6-f54b-4378-afed-11fc5d13da77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.455 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5001df36-3c56-487d-82eb-e715c1547595 namespace which is not needed anymore
Nov 24 14:30:42 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 24 14:30:42 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 13.008s CPU time.
Nov 24 14:30:42 compute-0 systemd-machined[153483]: Machine qemu-2-instance-00000002 terminated.
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.510 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:42 compute-0 neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595[213851]: [NOTICE]   (213855) : haproxy version is 2.8.14-c23fe91
Nov 24 14:30:42 compute-0 neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595[213851]: [NOTICE]   (213855) : path to executable is /usr/sbin/haproxy
Nov 24 14:30:42 compute-0 neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595[213851]: [WARNING]  (213855) : Exiting Master process...
Nov 24 14:30:42 compute-0 neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595[213851]: [ALERT]    (213855) : Current worker (213857) exited with code 143 (Terminated)
Nov 24 14:30:42 compute-0 neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595[213851]: [WARNING]  (213855) : All workers exited. Exiting... (0)
Nov 24 14:30:42 compute-0 systemd[1]: libpod-3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0.scope: Deactivated successfully.
Nov 24 14:30:42 compute-0 podman[214024]: 2025-11-24 14:30:42.631951025 +0000 UTC m=+0.053544013 container died 3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:30:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0-userdata-shm.mount: Deactivated successfully.
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.660 187122 INFO nova.virt.libvirt.driver [-] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Instance destroyed successfully.
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.660 187122 DEBUG nova.objects.instance [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid 992fc509-2c64-4e90-8c91-9e657e37b9c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:30:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-78c95f4ed10827dfa3bc5a150f9baa96a9616f6d716e638dd1fe149757dc7b8c-merged.mount: Deactivated successfully.
Nov 24 14:30:42 compute-0 podman[214024]: 2025-11-24 14:30:42.670131391 +0000 UTC m=+0.091724389 container cleanup 3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.673 187122 DEBUG nova.virt.libvirt.vif [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:30:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1381207947',display_name='tempest-TestNetworkBasicOps-server-1381207947',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1381207947',id=2,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKz1zKAjc+yf5li4da8XIvqWDXRuTDpiaslTgmqHXzBGz94VFVJhUaQqgWe3b+EYB3Rh82kd8dRo5mKSkVT1NTEeg5LwcEe3f+FXePfWPOOSqrhT5QwRKjg0U9ej6crFZQ==',key_name='tempest-TestNetworkBasicOps-42396895',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:30:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-xxy4lmcs',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:30:19Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=992fc509-2c64-4e90-8c91-9e657e37b9c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.674 187122 DEBUG nova.network.os_vif_util [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "bdac8846-16d8-4956-86de-7562233b3a16", "address": "fa:16:3e:47:2c:3f", "network": {"id": "5001df36-3c56-487d-82eb-e715c1547595", "bridge": "br-int", "label": "tempest-network-smoke--479301746", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbdac8846-16", "ovs_interfaceid": "bdac8846-16d8-4956-86de-7562233b3a16", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.675 187122 DEBUG nova.network.os_vif_util [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:2c:3f,bridge_name='br-int',has_traffic_filtering=True,id=bdac8846-16d8-4956-86de-7562233b3a16,network=Network(5001df36-3c56-487d-82eb-e715c1547595),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbdac8846-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.676 187122 DEBUG os_vif [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:2c:3f,bridge_name='br-int',has_traffic_filtering=True,id=bdac8846-16d8-4956-86de-7562233b3a16,network=Network(5001df36-3c56-487d-82eb-e715c1547595),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbdac8846-16') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.678 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.679 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbdac8846-16, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.685 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.690 187122 INFO os_vif [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:2c:3f,bridge_name='br-int',has_traffic_filtering=True,id=bdac8846-16d8-4956-86de-7562233b3a16,network=Network(5001df36-3c56-487d-82eb-e715c1547595),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbdac8846-16')
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.691 187122 INFO nova.virt.libvirt.driver [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Deleting instance files /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1_del
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.692 187122 INFO nova.virt.libvirt.driver [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Deletion of /var/lib/nova/instances/992fc509-2c64-4e90-8c91-9e657e37b9c1_del complete
Nov 24 14:30:42 compute-0 systemd[1]: libpod-conmon-3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0.scope: Deactivated successfully.
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.763 187122 DEBUG nova.virt.libvirt.host [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.764 187122 INFO nova.virt.libvirt.host [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] UEFI support detected
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.767 187122 INFO nova.compute.manager [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.768 187122 DEBUG oslo.service.loopingcall [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.768 187122 DEBUG nova.compute.manager [-] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.768 187122 DEBUG nova.network.neutron [-] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:30:42 compute-0 podman[214072]: 2025-11-24 14:30:42.771551901 +0000 UTC m=+0.059783532 container remove 3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.777 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b408ba-2702-439a-81ba-967053525d86]: (4, ('Mon Nov 24 02:30:42 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595 (3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0)\n3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0\nMon Nov 24 02:30:42 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5001df36-3c56-487d-82eb-e715c1547595 (3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0)\n3af47845e71b0464dd68c44ec06795295539c6521e6c02bc589fd3fbc0021aa0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.779 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[072a6d21-bf7b-40bf-8d3b-8d311cc93b05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.780 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5001df36-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.782 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:42 compute-0 kernel: tap5001df36-30: left promiscuous mode
Nov 24 14:30:42 compute-0 nova_compute[187118]: 2025-11-24 14:30:42.798 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.801 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f9d29918-8450-421e-be29-214d2c781f5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.827 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[af67e2f3-6dab-41fb-8445-64198a161b73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.829 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f8342b19-aae6-4447-ab6f-39e90aef95db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.850 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[47497f33-bd49-4795-8c15-f99d727dc626]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 288583, 'reachable_time': 16374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214089, 'error': None, 'target': 'ovnmeta-5001df36-3c56-487d-82eb-e715c1547595', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:42 compute-0 systemd[1]: run-netns-ovnmeta\x2d5001df36\x2d3c56\x2d487d\x2d82eb\x2de715c1547595.mount: Deactivated successfully.
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.863 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5001df36-3c56-487d-82eb-e715c1547595 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:30:42 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:42.865 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8be81607-f796-4425-bea0-376796384b35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.330 187122 DEBUG nova.compute.manager [req-604b1bb3-972a-4191-8b02-64f1240136dc req-46fe0168-f075-4c9e-9853-172327a1a040 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Received event network-vif-unplugged-bdac8846-16d8-4956-86de-7562233b3a16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.331 187122 DEBUG oslo_concurrency.lockutils [req-604b1bb3-972a-4191-8b02-64f1240136dc req-46fe0168-f075-4c9e-9853-172327a1a040 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.331 187122 DEBUG oslo_concurrency.lockutils [req-604b1bb3-972a-4191-8b02-64f1240136dc req-46fe0168-f075-4c9e-9853-172327a1a040 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.331 187122 DEBUG oslo_concurrency.lockutils [req-604b1bb3-972a-4191-8b02-64f1240136dc req-46fe0168-f075-4c9e-9853-172327a1a040 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.331 187122 DEBUG nova.compute.manager [req-604b1bb3-972a-4191-8b02-64f1240136dc req-46fe0168-f075-4c9e-9853-172327a1a040 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] No waiting events found dispatching network-vif-unplugged-bdac8846-16d8-4956-86de-7562233b3a16 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.332 187122 DEBUG nova.compute.manager [req-604b1bb3-972a-4191-8b02-64f1240136dc req-46fe0168-f075-4c9e-9853-172327a1a040 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Received event network-vif-unplugged-bdac8846-16d8-4956-86de-7562233b3a16 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.528 187122 DEBUG nova.network.neutron [-] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.548 187122 INFO nova.compute.manager [-] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Took 1.78 seconds to deallocate network for instance.
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.590 187122 DEBUG oslo_concurrency.lockutils [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.591 187122 DEBUG oslo_concurrency.lockutils [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.683 187122 DEBUG nova.compute.provider_tree [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.696 187122 DEBUG nova.scheduler.client.report [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.710 187122 DEBUG oslo_concurrency.lockutils [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.732 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.738 187122 INFO nova.scheduler.client.report [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance 992fc509-2c64-4e90-8c91-9e657e37b9c1
Nov 24 14:30:44 compute-0 nova_compute[187118]: 2025-11-24 14:30:44.793 187122 DEBUG oslo_concurrency.lockutils [None req-4bdfe8f9-b451-4e0b-88d5-9163c2a888bc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.401s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:46 compute-0 nova_compute[187118]: 2025-11-24 14:30:46.427 187122 DEBUG nova.compute.manager [req-a3cf69c9-92f2-4102-92fa-bd9b4a6654d3 req-4d0e83fe-220e-4b6f-9710-593bc92c52f4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Received event network-vif-plugged-bdac8846-16d8-4956-86de-7562233b3a16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:46 compute-0 nova_compute[187118]: 2025-11-24 14:30:46.427 187122 DEBUG oslo_concurrency.lockutils [req-a3cf69c9-92f2-4102-92fa-bd9b4a6654d3 req-4d0e83fe-220e-4b6f-9710-593bc92c52f4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:46 compute-0 nova_compute[187118]: 2025-11-24 14:30:46.428 187122 DEBUG oslo_concurrency.lockutils [req-a3cf69c9-92f2-4102-92fa-bd9b4a6654d3 req-4d0e83fe-220e-4b6f-9710-593bc92c52f4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:46 compute-0 nova_compute[187118]: 2025-11-24 14:30:46.428 187122 DEBUG oslo_concurrency.lockutils [req-a3cf69c9-92f2-4102-92fa-bd9b4a6654d3 req-4d0e83fe-220e-4b6f-9710-593bc92c52f4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "992fc509-2c64-4e90-8c91-9e657e37b9c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:46 compute-0 nova_compute[187118]: 2025-11-24 14:30:46.428 187122 DEBUG nova.compute.manager [req-a3cf69c9-92f2-4102-92fa-bd9b4a6654d3 req-4d0e83fe-220e-4b6f-9710-593bc92c52f4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] No waiting events found dispatching network-vif-plugged-bdac8846-16d8-4956-86de-7562233b3a16 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:30:46 compute-0 nova_compute[187118]: 2025-11-24 14:30:46.429 187122 WARNING nova.compute.manager [req-a3cf69c9-92f2-4102-92fa-bd9b4a6654d3 req-4d0e83fe-220e-4b6f-9710-593bc92c52f4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Received unexpected event network-vif-plugged-bdac8846-16d8-4956-86de-7562233b3a16 for instance with vm_state deleted and task_state None.
Nov 24 14:30:46 compute-0 nova_compute[187118]: 2025-11-24 14:30:46.429 187122 DEBUG nova.compute.manager [req-a3cf69c9-92f2-4102-92fa-bd9b4a6654d3 req-4d0e83fe-220e-4b6f-9710-593bc92c52f4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Received event network-vif-deleted-bdac8846-16d8-4956-86de-7562233b3a16 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:47 compute-0 ovn_controller[95613]: 2025-11-24T14:30:47Z|00042|binding|INFO|Releasing lport 3422fe53-fb68-45d5-b5e6-00b6f3797166 from this chassis (sb_readonly=0)
Nov 24 14:30:47 compute-0 podman[214091]: 2025-11-24 14:30:47.518304906 +0000 UTC m=+0.116745737 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Nov 24 14:30:47 compute-0 nova_compute[187118]: 2025-11-24 14:30:47.562 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:47 compute-0 nova_compute[187118]: 2025-11-24 14:30:47.682 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.503 187122 DEBUG oslo_concurrency.lockutils [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "bb69573d-afb8-4ab1-833e-04ae871dcad7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.504 187122 DEBUG oslo_concurrency.lockutils [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.505 187122 DEBUG oslo_concurrency.lockutils [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.505 187122 DEBUG oslo_concurrency.lockutils [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.506 187122 DEBUG oslo_concurrency.lockutils [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.508 187122 INFO nova.compute.manager [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Terminating instance
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.510 187122 DEBUG nova.compute.manager [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:30:48 compute-0 podman[214112]: 2025-11-24 14:30:48.525811306 +0000 UTC m=+0.121957168 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 24 14:30:48 compute-0 kernel: tapaba4af13-ce (unregistering): left promiscuous mode
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.540 187122 DEBUG nova.compute.manager [req-11979aca-654a-46d1-a712-0d577c3939e2 req-52eab0cf-4556-4b47-86f7-bcbf9935bf83 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received event network-changed-aba4af13-ceac-4d72-af85-e39af5aec20c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.541 187122 DEBUG nova.compute.manager [req-11979aca-654a-46d1-a712-0d577c3939e2 req-52eab0cf-4556-4b47-86f7-bcbf9935bf83 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Refreshing instance network info cache due to event network-changed-aba4af13-ceac-4d72-af85-e39af5aec20c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:30:48 compute-0 NetworkManager[55697]: <info>  [1763994648.5419] device (tapaba4af13-ce): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.542 187122 DEBUG oslo_concurrency.lockutils [req-11979aca-654a-46d1-a712-0d577c3939e2 req-52eab0cf-4556-4b47-86f7-bcbf9935bf83 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.542 187122 DEBUG oslo_concurrency.lockutils [req-11979aca-654a-46d1-a712-0d577c3939e2 req-52eab0cf-4556-4b47-86f7-bcbf9935bf83 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.543 187122 DEBUG nova.network.neutron [req-11979aca-654a-46d1-a712-0d577c3939e2 req-52eab0cf-4556-4b47-86f7-bcbf9935bf83 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Refreshing network info cache for port aba4af13-ceac-4d72-af85-e39af5aec20c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:30:48 compute-0 ovn_controller[95613]: 2025-11-24T14:30:48Z|00043|binding|INFO|Releasing lport aba4af13-ceac-4d72-af85-e39af5aec20c from this chassis (sb_readonly=0)
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.550 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 ovn_controller[95613]: 2025-11-24T14:30:48Z|00044|binding|INFO|Setting lport aba4af13-ceac-4d72-af85-e39af5aec20c down in Southbound
Nov 24 14:30:48 compute-0 ovn_controller[95613]: 2025-11-24T14:30:48Z|00045|binding|INFO|Removing iface tapaba4af13-ce ovn-installed in OVS
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.554 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.560 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:a2:ef 10.100.0.14'], port_security=['fa:16:3e:25:a2:ef 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'bb69573d-afb8-4ab1-833e-04ae871dcad7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef00357c-8383-4ce4-bb83-80ee7be7b5b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5f5ab2be-3ace-4e21-972d-0f2a6aba47d5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fe482998-02d7-4d4f-bc96-de688bc3ae29, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=aba4af13-ceac-4d72-af85-e39af5aec20c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.561 104469 INFO neutron.agent.ovn.metadata.agent [-] Port aba4af13-ceac-4d72-af85-e39af5aec20c in datapath ef00357c-8383-4ce4-bb83-80ee7be7b5b1 unbound from our chassis
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.563 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef00357c-8383-4ce4-bb83-80ee7be7b5b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.567 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[cb7c7c21-18a6-40fc-ae47-f24aafe0760f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.568 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1 namespace which is not needed anymore
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.577 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Nov 24 14:30:48 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 15.329s CPU time.
Nov 24 14:30:48 compute-0 systemd-machined[153483]: Machine qemu-1-instance-00000001 terminated.
Nov 24 14:30:48 compute-0 neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1[213515]: [NOTICE]   (213519) : haproxy version is 2.8.14-c23fe91
Nov 24 14:30:48 compute-0 neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1[213515]: [NOTICE]   (213519) : path to executable is /usr/sbin/haproxy
Nov 24 14:30:48 compute-0 neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1[213515]: [WARNING]  (213519) : Exiting Master process...
Nov 24 14:30:48 compute-0 neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1[213515]: [WARNING]  (213519) : Exiting Master process...
Nov 24 14:30:48 compute-0 neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1[213515]: [ALERT]    (213519) : Current worker (213521) exited with code 143 (Terminated)
Nov 24 14:30:48 compute-0 neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1[213515]: [WARNING]  (213519) : All workers exited. Exiting... (0)
Nov 24 14:30:48 compute-0 systemd[1]: libpod-4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15.scope: Deactivated successfully.
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.729 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 podman[214163]: 2025-11-24 14:30:48.731832053 +0000 UTC m=+0.050029728 container died 4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118)
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.734 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15-userdata-shm.mount: Deactivated successfully.
Nov 24 14:30:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-8d06d07b21e7561b93d250c31a34b7471b9570511e8582a5c05972ba91cbf210-merged.mount: Deactivated successfully.
Nov 24 14:30:48 compute-0 podman[214163]: 2025-11-24 14:30:48.760338165 +0000 UTC m=+0.078535840 container cleanup 4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.769 187122 INFO nova.virt.libvirt.driver [-] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Instance destroyed successfully.
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.770 187122 DEBUG nova.objects.instance [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid bb69573d-afb8-4ab1-833e-04ae871dcad7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:30:48 compute-0 systemd[1]: libpod-conmon-4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15.scope: Deactivated successfully.
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.780 187122 DEBUG nova.virt.libvirt.vif [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:29:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1032848293',display_name='tempest-TestNetworkBasicOps-server-1032848293',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1032848293',id=1,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAg+4WDbs9dvf+ZfMdzWe+3frfT2iNST3PznRg9a5JPQ5k9XOSE1hSYOU22Jk4o7BXQkELAaKL7sRamVWrjqRPyn6FTi2nPDS7facyBL0RDXEHXBjDvsTuuOIpe2dedRLA==',key_name='tempest-TestNetworkBasicOps-2128408494',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:29:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-t0qg6gl6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:29:42Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bb69573d-afb8-4ab1-833e-04ae871dcad7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.781 187122 DEBUG nova.network.os_vif_util [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.781 187122 DEBUG nova.network.os_vif_util [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:25:a2:ef,bridge_name='br-int',has_traffic_filtering=True,id=aba4af13-ceac-4d72-af85-e39af5aec20c,network=Network(ef00357c-8383-4ce4-bb83-80ee7be7b5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaba4af13-ce') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.782 187122 DEBUG os_vif [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:a2:ef,bridge_name='br-int',has_traffic_filtering=True,id=aba4af13-ceac-4d72-af85-e39af5aec20c,network=Network(ef00357c-8383-4ce4-bb83-80ee7be7b5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaba4af13-ce') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.783 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.783 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaba4af13-ce, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.784 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.785 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.787 187122 INFO os_vif [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:25:a2:ef,bridge_name='br-int',has_traffic_filtering=True,id=aba4af13-ceac-4d72-af85-e39af5aec20c,network=Network(ef00357c-8383-4ce4-bb83-80ee7be7b5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaba4af13-ce')
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.787 187122 INFO nova.virt.libvirt.driver [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Deleting instance files /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7_del
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.788 187122 INFO nova.virt.libvirt.driver [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Deletion of /var/lib/nova/instances/bb69573d-afb8-4ab1-833e-04ae871dcad7_del complete
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.825 187122 INFO nova.compute.manager [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Took 0.31 seconds to destroy the instance on the hypervisor.
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.826 187122 DEBUG oslo.service.loopingcall [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.826 187122 DEBUG nova.compute.manager [-] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.826 187122 DEBUG nova.network.neutron [-] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:30:48 compute-0 podman[214206]: 2025-11-24 14:30:48.834880627 +0000 UTC m=+0.045474195 container remove 4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.841 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[c0a6e883-b14a-4785-b999-03fc1f7ce243]: (4, ('Mon Nov 24 02:30:48 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1 (4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15)\n4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15\nMon Nov 24 02:30:48 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1 (4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15)\n4a1b7cb5ec0001ae11ab2ce33640c62baa19384a9b338d0e2312f7e939bc2e15\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.843 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[1914f384-d339-4487-b9cc-3ddb2b62257f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.844 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef00357c-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.845 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 kernel: tapef00357c-80: left promiscuous mode
Nov 24 14:30:48 compute-0 nova_compute[187118]: 2025-11-24 14:30:48.859 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.863 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[200e64c2-b37b-44ee-9710-376e38619b3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.885 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e018e9d9-5b49-4bb0-a4a4-13ca38de3f46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.886 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[67ac4d3a-e0db-4188-9536-592316b34e07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.905 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7111a0ae-936a-4013-855a-dd4d5014b976]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 285272, 'reachable_time': 32199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214221, 'error': None, 'target': 'ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.908 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef00357c-8383-4ce4-bb83-80ee7be7b5b1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:30:48 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:48.908 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[51cf81a5-f8a5-4ecb-8fc5-d7ee92261d2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:30:48 compute-0 systemd[1]: run-netns-ovnmeta\x2def00357c\x2d8383\x2d4ce4\x2dbb83\x2d80ee7be7b5b1.mount: Deactivated successfully.
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.439 187122 DEBUG nova.compute.manager [req-c1d5cc22-6f88-42f1-ad65-545efb1a5523 req-bce9306d-47df-4813-9ea6-2c6d5135e8ea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received event network-vif-unplugged-aba4af13-ceac-4d72-af85-e39af5aec20c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.440 187122 DEBUG oslo_concurrency.lockutils [req-c1d5cc22-6f88-42f1-ad65-545efb1a5523 req-bce9306d-47df-4813-9ea6-2c6d5135e8ea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.440 187122 DEBUG oslo_concurrency.lockutils [req-c1d5cc22-6f88-42f1-ad65-545efb1a5523 req-bce9306d-47df-4813-9ea6-2c6d5135e8ea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.440 187122 DEBUG oslo_concurrency.lockutils [req-c1d5cc22-6f88-42f1-ad65-545efb1a5523 req-bce9306d-47df-4813-9ea6-2c6d5135e8ea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.441 187122 DEBUG nova.compute.manager [req-c1d5cc22-6f88-42f1-ad65-545efb1a5523 req-bce9306d-47df-4813-9ea6-2c6d5135e8ea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] No waiting events found dispatching network-vif-unplugged-aba4af13-ceac-4d72-af85-e39af5aec20c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.441 187122 DEBUG nova.compute.manager [req-c1d5cc22-6f88-42f1-ad65-545efb1a5523 req-bce9306d-47df-4813-9ea6-2c6d5135e8ea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received event network-vif-unplugged-aba4af13-ceac-4d72-af85-e39af5aec20c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.696 187122 DEBUG nova.network.neutron [-] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.708 187122 INFO nova.compute.manager [-] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Took 0.88 seconds to deallocate network for instance.
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.735 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.743 187122 DEBUG oslo_concurrency.lockutils [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.743 187122 DEBUG oslo_concurrency.lockutils [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.807 187122 DEBUG nova.compute.provider_tree [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.820 187122 DEBUG nova.scheduler.client.report [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.845 187122 DEBUG oslo_concurrency.lockutils [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.876 187122 INFO nova.scheduler.client.report [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance bb69573d-afb8-4ab1-833e-04ae871dcad7
Nov 24 14:30:49 compute-0 nova_compute[187118]: 2025-11-24 14:30:49.948 187122 DEBUG oslo_concurrency.lockutils [None req-9f5e12e0-f382-478d-b268-f62be653b9c0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.443s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:50 compute-0 nova_compute[187118]: 2025-11-24 14:30:50.085 187122 DEBUG nova.network.neutron [req-11979aca-654a-46d1-a712-0d577c3939e2 req-52eab0cf-4556-4b47-86f7-bcbf9935bf83 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updated VIF entry in instance network info cache for port aba4af13-ceac-4d72-af85-e39af5aec20c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:30:50 compute-0 nova_compute[187118]: 2025-11-24 14:30:50.086 187122 DEBUG nova.network.neutron [req-11979aca-654a-46d1-a712-0d577c3939e2 req-52eab0cf-4556-4b47-86f7-bcbf9935bf83 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Updating instance_info_cache with network_info: [{"id": "aba4af13-ceac-4d72-af85-e39af5aec20c", "address": "fa:16:3e:25:a2:ef", "network": {"id": "ef00357c-8383-4ce4-bb83-80ee7be7b5b1", "bridge": "br-int", "label": "tempest-network-smoke--408460771", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaba4af13-ce", "ovs_interfaceid": "aba4af13-ceac-4d72-af85-e39af5aec20c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:30:50 compute-0 nova_compute[187118]: 2025-11-24 14:30:50.105 187122 DEBUG oslo_concurrency.lockutils [req-11979aca-654a-46d1-a712-0d577c3939e2 req-52eab0cf-4556-4b47-86f7-bcbf9935bf83 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-bb69573d-afb8-4ab1-833e-04ae871dcad7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:30:51 compute-0 nova_compute[187118]: 2025-11-24 14:30:51.513 187122 DEBUG nova.compute.manager [req-6dcb242b-3d6d-4ac3-a924-3398790b85b3 req-a725cd70-c98f-4301-8b33-8bc25bcdaeb1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received event network-vif-plugged-aba4af13-ceac-4d72-af85-e39af5aec20c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:51 compute-0 nova_compute[187118]: 2025-11-24 14:30:51.513 187122 DEBUG oslo_concurrency.lockutils [req-6dcb242b-3d6d-4ac3-a924-3398790b85b3 req-a725cd70-c98f-4301-8b33-8bc25bcdaeb1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:51 compute-0 nova_compute[187118]: 2025-11-24 14:30:51.514 187122 DEBUG oslo_concurrency.lockutils [req-6dcb242b-3d6d-4ac3-a924-3398790b85b3 req-a725cd70-c98f-4301-8b33-8bc25bcdaeb1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:51 compute-0 nova_compute[187118]: 2025-11-24 14:30:51.514 187122 DEBUG oslo_concurrency.lockutils [req-6dcb242b-3d6d-4ac3-a924-3398790b85b3 req-a725cd70-c98f-4301-8b33-8bc25bcdaeb1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bb69573d-afb8-4ab1-833e-04ae871dcad7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:51 compute-0 nova_compute[187118]: 2025-11-24 14:30:51.514 187122 DEBUG nova.compute.manager [req-6dcb242b-3d6d-4ac3-a924-3398790b85b3 req-a725cd70-c98f-4301-8b33-8bc25bcdaeb1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] No waiting events found dispatching network-vif-plugged-aba4af13-ceac-4d72-af85-e39af5aec20c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:30:51 compute-0 nova_compute[187118]: 2025-11-24 14:30:51.514 187122 WARNING nova.compute.manager [req-6dcb242b-3d6d-4ac3-a924-3398790b85b3 req-a725cd70-c98f-4301-8b33-8bc25bcdaeb1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received unexpected event network-vif-plugged-aba4af13-ceac-4d72-af85-e39af5aec20c for instance with vm_state deleted and task_state None.
Nov 24 14:30:51 compute-0 nova_compute[187118]: 2025-11-24 14:30:51.514 187122 DEBUG nova.compute.manager [req-6dcb242b-3d6d-4ac3-a924-3398790b85b3 req-a725cd70-c98f-4301-8b33-8bc25bcdaeb1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Received event network-vif-deleted-aba4af13-ceac-4d72-af85-e39af5aec20c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:30:52 compute-0 podman[214223]: 2025-11-24 14:30:52.484674074 +0000 UTC m=+0.098589654 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:30:53 compute-0 nova_compute[187118]: 2025-11-24 14:30:53.787 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:54 compute-0 nova_compute[187118]: 2025-11-24 14:30:54.344 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:54 compute-0 nova_compute[187118]: 2025-11-24 14:30:54.436 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:54 compute-0 nova_compute[187118]: 2025-11-24 14:30:54.736 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:56.657 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:30:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:56.658 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:30:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:30:56.658 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:30:57 compute-0 nova_compute[187118]: 2025-11-24 14:30:57.659 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763994642.6566699, 992fc509-2c64-4e90-8c91-9e657e37b9c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:30:57 compute-0 nova_compute[187118]: 2025-11-24 14:30:57.661 187122 INFO nova.compute.manager [-] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] VM Stopped (Lifecycle Event)
Nov 24 14:30:57 compute-0 nova_compute[187118]: 2025-11-24 14:30:57.677 187122 DEBUG nova.compute.manager [None req-f9d798fa-9fe1-411b-b999-b2f435c2fb1c - - - - - -] [instance: 992fc509-2c64-4e90-8c91-9e657e37b9c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:30:58 compute-0 nova_compute[187118]: 2025-11-24 14:30:58.792 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:30:59 compute-0 nova_compute[187118]: 2025-11-24 14:30:59.739 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:03 compute-0 nova_compute[187118]: 2025-11-24 14:31:03.769 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763994648.7674422, bb69573d-afb8-4ab1-833e-04ae871dcad7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:31:03 compute-0 nova_compute[187118]: 2025-11-24 14:31:03.770 187122 INFO nova.compute.manager [-] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] VM Stopped (Lifecycle Event)
Nov 24 14:31:03 compute-0 nova_compute[187118]: 2025-11-24 14:31:03.785 187122 DEBUG nova.compute.manager [None req-c25b58de-8cf6-4434-93c0-7545bcb718fd - - - - - -] [instance: bb69573d-afb8-4ab1-833e-04ae871dcad7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:31:03 compute-0 nova_compute[187118]: 2025-11-24 14:31:03.795 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.305 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.306 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.327 187122 DEBUG nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.411 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.412 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.422 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.422 187122 INFO nova.compute.claims [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:31:04 compute-0 podman[214250]: 2025-11-24 14:31:04.492158354 +0000 UTC m=+0.088576774 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.532 187122 DEBUG nova.compute.provider_tree [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.545 187122 DEBUG nova.scheduler.client.report [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.567 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.568 187122 DEBUG nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.631 187122 DEBUG nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.632 187122 DEBUG nova.network.neutron [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.741 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.773 187122 INFO nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.787 187122 DEBUG nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.955 187122 DEBUG nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.956 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.956 187122 INFO nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Creating image(s)
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.957 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.957 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.958 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:04 compute-0 nova_compute[187118]: 2025-11-24 14:31:04.974 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.068 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.069 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.069 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.084 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.178 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.179 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.214 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.215 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.216 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.268 187122 DEBUG nova.policy [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.284 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.285 187122 DEBUG nova.virt.disk.api [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.286 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.354 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.355 187122 DEBUG nova.virt.disk.api [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.356 187122 DEBUG nova.objects.instance [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid bbac6879-7cbf-4cf3-a37f-eefb9329007d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.367 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.367 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Ensure instance console log exists: /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.368 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.368 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:05 compute-0 nova_compute[187118]: 2025-11-24 14:31:05.368 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:06 compute-0 nova_compute[187118]: 2025-11-24 14:31:06.099 187122 DEBUG nova.network.neutron [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Successfully created port: d569200b-51d5-4c6d-bc10-80fa732cc80e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:31:06 compute-0 nova_compute[187118]: 2025-11-24 14:31:06.768 187122 DEBUG nova.network.neutron [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Successfully updated port: d569200b-51d5-4c6d-bc10-80fa732cc80e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:31:06 compute-0 nova_compute[187118]: 2025-11-24 14:31:06.786 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:31:06 compute-0 nova_compute[187118]: 2025-11-24 14:31:06.786 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:31:06 compute-0 nova_compute[187118]: 2025-11-24 14:31:06.787 187122 DEBUG nova.network.neutron [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:31:06 compute-0 nova_compute[187118]: 2025-11-24 14:31:06.833 187122 DEBUG nova.compute.manager [req-9b2861ab-f599-4bae-9759-ff10a26f8475 req-8fab092a-c739-4bba-9896-e007827040b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-changed-d569200b-51d5-4c6d-bc10-80fa732cc80e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:06 compute-0 nova_compute[187118]: 2025-11-24 14:31:06.833 187122 DEBUG nova.compute.manager [req-9b2861ab-f599-4bae-9759-ff10a26f8475 req-8fab092a-c739-4bba-9896-e007827040b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Refreshing instance network info cache due to event network-changed-d569200b-51d5-4c6d-bc10-80fa732cc80e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:31:06 compute-0 nova_compute[187118]: 2025-11-24 14:31:06.834 187122 DEBUG oslo_concurrency.lockutils [req-9b2861ab-f599-4bae-9759-ff10a26f8475 req-8fab092a-c739-4bba-9896-e007827040b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:31:06 compute-0 nova_compute[187118]: 2025-11-24 14:31:06.907 187122 DEBUG nova.network.neutron [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.719 187122 DEBUG nova.network.neutron [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updating instance_info_cache with network_info: [{"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.738 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.739 187122 DEBUG nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Instance network_info: |[{"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.740 187122 DEBUG oslo_concurrency.lockutils [req-9b2861ab-f599-4bae-9759-ff10a26f8475 req-8fab092a-c739-4bba-9896-e007827040b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.741 187122 DEBUG nova.network.neutron [req-9b2861ab-f599-4bae-9759-ff10a26f8475 req-8fab092a-c739-4bba-9896-e007827040b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Refreshing network info cache for port d569200b-51d5-4c6d-bc10-80fa732cc80e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.747 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Start _get_guest_xml network_info=[{"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.756 187122 WARNING nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.768 187122 DEBUG nova.virt.libvirt.host [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.769 187122 DEBUG nova.virt.libvirt.host [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.775 187122 DEBUG nova.virt.libvirt.host [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.775 187122 DEBUG nova.virt.libvirt.host [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.776 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.776 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.777 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.777 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.777 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.778 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.778 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.778 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.779 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.779 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.779 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.779 187122 DEBUG nova.virt.hardware [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.783 187122 DEBUG nova.virt.libvirt.vif [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:31:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-522280633',display_name='tempest-TestNetworkBasicOps-server-522280633',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-522280633',id=3,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw01kb6tWeXzKnP2tEe2ke1IavFelsvpxw8koC03IFB6nrIOVyNbEIXtvsg/IciT0a27l1r0BucZeBOqJNDOn2UAu/N6i/WcjjG4gY5bFMiKfis5pyBCkQaDdjTkkfkHw==',key_name='tempest-TestNetworkBasicOps-1329473827',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-w0j86mxt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:31:04Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bbac6879-7cbf-4cf3-a37f-eefb9329007d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.784 187122 DEBUG nova.network.os_vif_util [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.785 187122 DEBUG nova.network.os_vif_util [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:4f:c4,bridge_name='br-int',has_traffic_filtering=True,id=d569200b-51d5-4c6d-bc10-80fa732cc80e,network=Network(38ed537e-137f-4008-8b5e-205116f17c56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd569200b-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.786 187122 DEBUG nova.objects.instance [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid bbac6879-7cbf-4cf3-a37f-eefb9329007d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.798 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <uuid>bbac6879-7cbf-4cf3-a37f-eefb9329007d</uuid>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <name>instance-00000003</name>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-522280633</nova:name>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:31:07</nova:creationTime>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:31:07 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:31:07 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:31:07 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:31:07 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:31:07 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:31:07 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:31:07 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:31:07 compute-0 nova_compute[187118]:         <nova:port uuid="d569200b-51d5-4c6d-bc10-80fa732cc80e">
Nov 24 14:31:07 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <system>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <entry name="serial">bbac6879-7cbf-4cf3-a37f-eefb9329007d</entry>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <entry name="uuid">bbac6879-7cbf-4cf3-a37f-eefb9329007d</entry>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     </system>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <os>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   </os>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <features>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   </features>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.config"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:f2:4f:c4"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <target dev="tapd569200b-51"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/console.log" append="off"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <video>
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     </video>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:31:07 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:31:07 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:31:07 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:31:07 compute-0 nova_compute[187118]: </domain>
Nov 24 14:31:07 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.799 187122 DEBUG nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Preparing to wait for external event network-vif-plugged-d569200b-51d5-4c6d-bc10-80fa732cc80e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.799 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.800 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.800 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.801 187122 DEBUG nova.virt.libvirt.vif [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:31:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-522280633',display_name='tempest-TestNetworkBasicOps-server-522280633',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-522280633',id=3,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw01kb6tWeXzKnP2tEe2ke1IavFelsvpxw8koC03IFB6nrIOVyNbEIXtvsg/IciT0a27l1r0BucZeBOqJNDOn2UAu/N6i/WcjjG4gY5bFMiKfis5pyBCkQaDdjTkkfkHw==',key_name='tempest-TestNetworkBasicOps-1329473827',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-w0j86mxt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:31:04Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bbac6879-7cbf-4cf3-a37f-eefb9329007d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.801 187122 DEBUG nova.network.os_vif_util [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.802 187122 DEBUG nova.network.os_vif_util [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:4f:c4,bridge_name='br-int',has_traffic_filtering=True,id=d569200b-51d5-4c6d-bc10-80fa732cc80e,network=Network(38ed537e-137f-4008-8b5e-205116f17c56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd569200b-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.802 187122 DEBUG os_vif [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:4f:c4,bridge_name='br-int',has_traffic_filtering=True,id=d569200b-51d5-4c6d-bc10-80fa732cc80e,network=Network(38ed537e-137f-4008-8b5e-205116f17c56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd569200b-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.803 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.803 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.804 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.809 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.809 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd569200b-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.809 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd569200b-51, col_values=(('external_ids', {'iface-id': 'd569200b-51d5-4c6d-bc10-80fa732cc80e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:4f:c4', 'vm-uuid': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.811 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:07 compute-0 NetworkManager[55697]: <info>  [1763994667.8128] manager: (tapd569200b-51): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.813 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.822 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.824 187122 INFO os_vif [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:4f:c4,bridge_name='br-int',has_traffic_filtering=True,id=d569200b-51d5-4c6d-bc10-80fa732cc80e,network=Network(38ed537e-137f-4008-8b5e-205116f17c56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd569200b-51')
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.883 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.884 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.884 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:f2:4f:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:31:07 compute-0 nova_compute[187118]: 2025-11-24 14:31:07.885 187122 INFO nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Using config drive
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.333 187122 INFO nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Creating config drive at /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.config
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.342 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf_dsnwok execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.469 187122 DEBUG oslo_concurrency.processutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf_dsnwok" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:31:08 compute-0 podman[214295]: 2025-11-24 14:31:08.480315025 +0000 UTC m=+0.079024508 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 14:31:08 compute-0 kernel: tapd569200b-51: entered promiscuous mode
Nov 24 14:31:08 compute-0 NetworkManager[55697]: <info>  [1763994668.5546] manager: (tapd569200b-51): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Nov 24 14:31:08 compute-0 ovn_controller[95613]: 2025-11-24T14:31:08Z|00046|binding|INFO|Claiming lport d569200b-51d5-4c6d-bc10-80fa732cc80e for this chassis.
Nov 24 14:31:08 compute-0 ovn_controller[95613]: 2025-11-24T14:31:08Z|00047|binding|INFO|d569200b-51d5-4c6d-bc10-80fa732cc80e: Claiming fa:16:3e:f2:4f:c4 10.100.0.4
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.556 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.570 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:4f:c4 10.100.0.4'], port_security=['fa:16:3e:f2:4f:c4 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38ed537e-137f-4008-8b5e-205116f17c56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b82a4a1e-b6e8-4ba4-a10d-838313a32f94', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12ae9689-a2d9-4eaf-8813-e59915c1ea74, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=d569200b-51d5-4c6d-bc10-80fa732cc80e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.572 104469 INFO neutron.agent.ovn.metadata.agent [-] Port d569200b-51d5-4c6d-bc10-80fa732cc80e in datapath 38ed537e-137f-4008-8b5e-205116f17c56 bound to our chassis
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.574 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 38ed537e-137f-4008-8b5e-205116f17c56
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.588 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[1564f173-405b-4aec-bbc3-2088879ed912]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.591 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap38ed537e-11 in ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.594 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap38ed537e-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.594 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[d2294530-5adc-4df4-afe4-d114b6f424d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.595 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[4ba9ab7b-ac3b-43e7-8e3d-725a0ca44ba7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 systemd-udevd[214331]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.610 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[8c89bbf8-0afd-47c6-9d15-f4b1b9ff97c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 systemd-machined[153483]: New machine qemu-3-instance-00000003.
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.612 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:08 compute-0 ovn_controller[95613]: 2025-11-24T14:31:08Z|00048|binding|INFO|Setting lport d569200b-51d5-4c6d-bc10-80fa732cc80e ovn-installed in OVS
Nov 24 14:31:08 compute-0 ovn_controller[95613]: 2025-11-24T14:31:08Z|00049|binding|INFO|Setting lport d569200b-51d5-4c6d-bc10-80fa732cc80e up in Southbound
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.617 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:08 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Nov 24 14:31:08 compute-0 NetworkManager[55697]: <info>  [1763994668.6352] device (tapd569200b-51): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:31:08 compute-0 NetworkManager[55697]: <info>  [1763994668.6362] device (tapd569200b-51): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.638 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[c865d02d-a4da-4d12-a2b2-0716dc98b168]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.667 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[89a930df-11ad-4699-8b8f-0d7f854d9d5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 systemd-udevd[214334]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:31:08 compute-0 NetworkManager[55697]: <info>  [1763994668.6760] manager: (tap38ed537e-10): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.675 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[4bf23384-4e89-45ae-b2a6-c693dc53b92c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.705 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[1b7f5f53-8ad2-4d1e-ae8e-947c61fe42c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.707 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[7282a095-3a72-401e-b1fb-4af540b93407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 NetworkManager[55697]: <info>  [1763994668.7293] device (tap38ed537e-10): carrier: link connected
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.734 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[efedc166-dbce-4b6f-a502-a35e2349f993]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.754 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[c323fcc0-2bad-46fa-b91c-61e0fc87c789]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap38ed537e-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:0b:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 293616, 'reachable_time': 35716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214362, 'error': None, 'target': 'ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.772 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[9136e122-0eb8-4c28-9736-576912e0c705]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe28:b69'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 293616, 'tstamp': 293616}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214363, 'error': None, 'target': 'ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.787 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[dafa6999-ef0a-4e08-b947-dd1604c7e784]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap38ed537e-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:28:0b:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 293616, 'reachable_time': 35716, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214364, 'error': None, 'target': 'ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.817 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[1891c2a6-3c7e-48df-b02b-a20fe5bf5b49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.822 187122 DEBUG nova.network.neutron [req-9b2861ab-f599-4bae-9759-ff10a26f8475 req-8fab092a-c739-4bba-9896-e007827040b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updated VIF entry in instance network info cache for port d569200b-51d5-4c6d-bc10-80fa732cc80e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.822 187122 DEBUG nova.network.neutron [req-9b2861ab-f599-4bae-9759-ff10a26f8475 req-8fab092a-c739-4bba-9896-e007827040b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updating instance_info_cache with network_info: [{"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.839 187122 DEBUG oslo_concurrency.lockutils [req-9b2861ab-f599-4bae-9759-ff10a26f8475 req-8fab092a-c739-4bba-9896-e007827040b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.879 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7446ef0e-6f30-4aa8-b075-e433c49bd1cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.881 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38ed537e-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.881 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.881 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap38ed537e-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:08 compute-0 NetworkManager[55697]: <info>  [1763994668.8841] manager: (tap38ed537e-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Nov 24 14:31:08 compute-0 kernel: tap38ed537e-10: entered promiscuous mode
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.884 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.885 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap38ed537e-10, col_values=(('external_ids', {'iface-id': 'cfedc122-e1c6-453f-b0a9-daf44fc138ee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:08 compute-0 ovn_controller[95613]: 2025-11-24T14:31:08Z|00050|binding|INFO|Releasing lport cfedc122-e1c6-453f-b0a9-daf44fc138ee from this chassis (sb_readonly=0)
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.897 187122 DEBUG nova.compute.manager [req-a8f66ea4-3861-4bcc-a1d1-20aebb027e20 req-b35326d0-c6ee-4791-b6d5-95b4000520ec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-plugged-d569200b-51d5-4c6d-bc10-80fa732cc80e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.898 187122 DEBUG oslo_concurrency.lockutils [req-a8f66ea4-3861-4bcc-a1d1-20aebb027e20 req-b35326d0-c6ee-4791-b6d5-95b4000520ec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.899 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/38ed537e-137f-4008-8b5e-205116f17c56.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/38ed537e-137f-4008-8b5e-205116f17c56.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.900 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[46eeda20-be5e-4973-988b-3a8bfc766382]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.899 187122 DEBUG oslo_concurrency.lockutils [req-a8f66ea4-3861-4bcc-a1d1-20aebb027e20 req-b35326d0-c6ee-4791-b6d5-95b4000520ec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.900 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-38ed537e-137f-4008-8b5e-205116f17c56
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/38ed537e-137f-4008-8b5e-205116f17c56.pid.haproxy
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID 38ed537e-137f-4008-8b5e-205116f17c56
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.900 187122 DEBUG oslo_concurrency.lockutils [req-a8f66ea4-3861-4bcc-a1d1-20aebb027e20 req-b35326d0-c6ee-4791-b6d5-95b4000520ec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.901 187122 DEBUG nova.compute.manager [req-a8f66ea4-3861-4bcc-a1d1-20aebb027e20 req-b35326d0-c6ee-4791-b6d5-95b4000520ec 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Processing event network-vif-plugged-d569200b-51d5-4c6d-bc10-80fa732cc80e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.902 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:08.902 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56', 'env', 'PROCESS_TAG=haproxy-38ed537e-137f-4008-8b5e-205116f17c56', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/38ed537e-137f-4008-8b5e-205116f17c56.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.918 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994668.9180055, bbac6879-7cbf-4cf3-a37f-eefb9329007d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.919 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] VM Started (Lifecycle Event)
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.923 187122 DEBUG nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.927 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.930 187122 INFO nova.virt.libvirt.driver [-] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Instance spawned successfully.
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.930 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.934 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.939 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.946 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.946 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.947 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.947 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.948 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.948 187122 DEBUG nova.virt.libvirt.driver [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.952 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.953 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994668.9192185, bbac6879-7cbf-4cf3-a37f-eefb9329007d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.953 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] VM Paused (Lifecycle Event)
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.974 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.977 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994668.9261045, bbac6879-7cbf-4cf3-a37f-eefb9329007d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.977 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] VM Resumed (Lifecycle Event)
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.993 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:31:08 compute-0 nova_compute[187118]: 2025-11-24 14:31:08.998 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:31:09 compute-0 nova_compute[187118]: 2025-11-24 14:31:09.003 187122 INFO nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Took 4.05 seconds to spawn the instance on the hypervisor.
Nov 24 14:31:09 compute-0 nova_compute[187118]: 2025-11-24 14:31:09.004 187122 DEBUG nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:31:09 compute-0 nova_compute[187118]: 2025-11-24 14:31:09.014 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:31:09 compute-0 nova_compute[187118]: 2025-11-24 14:31:09.068 187122 INFO nova.compute.manager [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Took 4.69 seconds to build instance.
Nov 24 14:31:09 compute-0 nova_compute[187118]: 2025-11-24 14:31:09.080 187122 DEBUG oslo_concurrency.lockutils [None req-e9403d6d-ff08-4e06-a483-dd09f405d325 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:09 compute-0 podman[214403]: 2025-11-24 14:31:09.313013588 +0000 UTC m=+0.062090835 container create ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 14:31:09 compute-0 podman[214403]: 2025-11-24 14:31:09.27373884 +0000 UTC m=+0.022816097 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:31:09 compute-0 systemd[1]: Started libpod-conmon-ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb.scope.
Nov 24 14:31:09 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:31:09 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/221bc654e95982cda48f23ba2df3333c320af8c28a90fb4bddc723126d353e9d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:31:09 compute-0 podman[214403]: 2025-11-24 14:31:09.410501068 +0000 UTC m=+0.159578325 container init ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 24 14:31:09 compute-0 podman[214403]: 2025-11-24 14:31:09.415505207 +0000 UTC m=+0.164582444 container start ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 14:31:09 compute-0 neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56[214418]: [NOTICE]   (214422) : New worker (214424) forked
Nov 24 14:31:09 compute-0 neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56[214418]: [NOTICE]   (214422) : Loading success.
Nov 24 14:31:09 compute-0 nova_compute[187118]: 2025-11-24 14:31:09.743 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:10 compute-0 nova_compute[187118]: 2025-11-24 14:31:10.975 187122 DEBUG nova.compute.manager [req-06a37d50-2cf8-419a-ba96-b4510928184b req-ef9cd902-5e6e-4265-bf4f-4d3d41dc34e4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-plugged-d569200b-51d5-4c6d-bc10-80fa732cc80e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:10 compute-0 nova_compute[187118]: 2025-11-24 14:31:10.975 187122 DEBUG oslo_concurrency.lockutils [req-06a37d50-2cf8-419a-ba96-b4510928184b req-ef9cd902-5e6e-4265-bf4f-4d3d41dc34e4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:10 compute-0 nova_compute[187118]: 2025-11-24 14:31:10.975 187122 DEBUG oslo_concurrency.lockutils [req-06a37d50-2cf8-419a-ba96-b4510928184b req-ef9cd902-5e6e-4265-bf4f-4d3d41dc34e4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:10 compute-0 nova_compute[187118]: 2025-11-24 14:31:10.976 187122 DEBUG oslo_concurrency.lockutils [req-06a37d50-2cf8-419a-ba96-b4510928184b req-ef9cd902-5e6e-4265-bf4f-4d3d41dc34e4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:10 compute-0 nova_compute[187118]: 2025-11-24 14:31:10.976 187122 DEBUG nova.compute.manager [req-06a37d50-2cf8-419a-ba96-b4510928184b req-ef9cd902-5e6e-4265-bf4f-4d3d41dc34e4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] No waiting events found dispatching network-vif-plugged-d569200b-51d5-4c6d-bc10-80fa732cc80e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:31:10 compute-0 nova_compute[187118]: 2025-11-24 14:31:10.976 187122 WARNING nova.compute.manager [req-06a37d50-2cf8-419a-ba96-b4510928184b req-ef9cd902-5e6e-4265-bf4f-4d3d41dc34e4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received unexpected event network-vif-plugged-d569200b-51d5-4c6d-bc10-80fa732cc80e for instance with vm_state active and task_state None.
Nov 24 14:31:12 compute-0 podman[214434]: 2025-11-24 14:31:12.457957527 +0000 UTC m=+0.063259843 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 14:31:12 compute-0 podman[214433]: 2025-11-24 14:31:12.459992489 +0000 UTC m=+0.063648604 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 24 14:31:12 compute-0 nova_compute[187118]: 2025-11-24 14:31:12.752 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:12 compute-0 NetworkManager[55697]: <info>  [1763994672.7559] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Nov 24 14:31:12 compute-0 ovn_controller[95613]: 2025-11-24T14:31:12Z|00051|binding|INFO|Releasing lport cfedc122-e1c6-453f-b0a9-daf44fc138ee from this chassis (sb_readonly=0)
Nov 24 14:31:12 compute-0 NetworkManager[55697]: <info>  [1763994672.7573] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 24 14:31:12 compute-0 ovn_controller[95613]: 2025-11-24T14:31:12Z|00052|binding|INFO|Releasing lport cfedc122-e1c6-453f-b0a9-daf44fc138ee from this chassis (sb_readonly=0)
Nov 24 14:31:12 compute-0 nova_compute[187118]: 2025-11-24 14:31:12.774 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:12 compute-0 nova_compute[187118]: 2025-11-24 14:31:12.780 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:12 compute-0 nova_compute[187118]: 2025-11-24 14:31:12.811 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:13 compute-0 nova_compute[187118]: 2025-11-24 14:31:13.171 187122 DEBUG nova.compute.manager [req-7fc422e4-dce8-4b0e-af4b-179e82a4d0bc req-f5631c95-0a77-46ec-b33e-aa6ad675c8c6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-changed-d569200b-51d5-4c6d-bc10-80fa732cc80e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:13 compute-0 nova_compute[187118]: 2025-11-24 14:31:13.171 187122 DEBUG nova.compute.manager [req-7fc422e4-dce8-4b0e-af4b-179e82a4d0bc req-f5631c95-0a77-46ec-b33e-aa6ad675c8c6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Refreshing instance network info cache due to event network-changed-d569200b-51d5-4c6d-bc10-80fa732cc80e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:31:13 compute-0 nova_compute[187118]: 2025-11-24 14:31:13.171 187122 DEBUG oslo_concurrency.lockutils [req-7fc422e4-dce8-4b0e-af4b-179e82a4d0bc req-f5631c95-0a77-46ec-b33e-aa6ad675c8c6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:31:13 compute-0 nova_compute[187118]: 2025-11-24 14:31:13.171 187122 DEBUG oslo_concurrency.lockutils [req-7fc422e4-dce8-4b0e-af4b-179e82a4d0bc req-f5631c95-0a77-46ec-b33e-aa6ad675c8c6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:31:13 compute-0 nova_compute[187118]: 2025-11-24 14:31:13.172 187122 DEBUG nova.network.neutron [req-7fc422e4-dce8-4b0e-af4b-179e82a4d0bc req-f5631c95-0a77-46ec-b33e-aa6ad675c8c6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Refreshing network info cache for port d569200b-51d5-4c6d-bc10-80fa732cc80e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:31:14 compute-0 nova_compute[187118]: 2025-11-24 14:31:14.744 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:14 compute-0 nova_compute[187118]: 2025-11-24 14:31:14.798 187122 DEBUG nova.network.neutron [req-7fc422e4-dce8-4b0e-af4b-179e82a4d0bc req-f5631c95-0a77-46ec-b33e-aa6ad675c8c6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updated VIF entry in instance network info cache for port d569200b-51d5-4c6d-bc10-80fa732cc80e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:31:14 compute-0 nova_compute[187118]: 2025-11-24 14:31:14.798 187122 DEBUG nova.network.neutron [req-7fc422e4-dce8-4b0e-af4b-179e82a4d0bc req-f5631c95-0a77-46ec-b33e-aa6ad675c8c6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updating instance_info_cache with network_info: [{"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:31:14 compute-0 nova_compute[187118]: 2025-11-24 14:31:14.813 187122 DEBUG oslo_concurrency.lockutils [req-7fc422e4-dce8-4b0e-af4b-179e82a4d0bc req-f5631c95-0a77-46ec-b33e-aa6ad675c8c6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:31:17 compute-0 nova_compute[187118]: 2025-11-24 14:31:17.813 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:18 compute-0 podman[214472]: 2025-11-24 14:31:18.473954951 +0000 UTC m=+0.083149325 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 24 14:31:19 compute-0 podman[214490]: 2025-11-24 14:31:19.51355124 +0000 UTC m=+0.115258958 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:31:19 compute-0 nova_compute[187118]: 2025-11-24 14:31:19.747 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:21 compute-0 ovn_controller[95613]: 2025-11-24T14:31:21Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:4f:c4 10.100.0.4
Nov 24 14:31:21 compute-0 ovn_controller[95613]: 2025-11-24T14:31:21Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:4f:c4 10.100.0.4
Nov 24 14:31:22 compute-0 nova_compute[187118]: 2025-11-24 14:31:22.815 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:23 compute-0 podman[214528]: 2025-11-24 14:31:23.467090343 +0000 UTC m=+0.065080920 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:31:24 compute-0 nova_compute[187118]: 2025-11-24 14:31:24.749 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:27 compute-0 nova_compute[187118]: 2025-11-24 14:31:27.613 187122 INFO nova.compute.manager [None req-53d226eb-8419-45d3-ad40-4fc73de7db41 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Get console output
Nov 24 14:31:27 compute-0 nova_compute[187118]: 2025-11-24 14:31:27.618 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:31:27 compute-0 nova_compute[187118]: 2025-11-24 14:31:27.818 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:28 compute-0 nova_compute[187118]: 2025-11-24 14:31:28.201 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:28.201 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:31:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:28.202 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:31:29 compute-0 nova_compute[187118]: 2025-11-24 14:31:29.750 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:31 compute-0 nova_compute[187118]: 2025-11-24 14:31:31.883 187122 DEBUG oslo_concurrency.lockutils [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "interface-bbac6879-7cbf-4cf3-a37f-eefb9329007d-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:31 compute-0 nova_compute[187118]: 2025-11-24 14:31:31.884 187122 DEBUG oslo_concurrency.lockutils [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "interface-bbac6879-7cbf-4cf3-a37f-eefb9329007d-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:31 compute-0 nova_compute[187118]: 2025-11-24 14:31:31.884 187122 DEBUG nova.objects.instance [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'flavor' on Instance uuid bbac6879-7cbf-4cf3-a37f-eefb9329007d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:31:32 compute-0 nova_compute[187118]: 2025-11-24 14:31:32.394 187122 DEBUG nova.objects.instance [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_requests' on Instance uuid bbac6879-7cbf-4cf3-a37f-eefb9329007d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:31:32 compute-0 nova_compute[187118]: 2025-11-24 14:31:32.411 187122 DEBUG nova.network.neutron [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:31:32 compute-0 nova_compute[187118]: 2025-11-24 14:31:32.566 187122 DEBUG nova.policy [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:31:32 compute-0 nova_compute[187118]: 2025-11-24 14:31:32.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:31:32 compute-0 nova_compute[187118]: 2025-11-24 14:31:32.820 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:33 compute-0 nova_compute[187118]: 2025-11-24 14:31:33.197 187122 DEBUG nova.network.neutron [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Successfully created port: 9fc2ca85-0d0d-4a58-9141-850ad8736a28 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:31:33 compute-0 nova_compute[187118]: 2025-11-24 14:31:33.788 187122 DEBUG nova.network.neutron [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Successfully updated port: 9fc2ca85-0d0d-4a58-9141-850ad8736a28 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:31:33 compute-0 nova_compute[187118]: 2025-11-24 14:31:33.807 187122 DEBUG oslo_concurrency.lockutils [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:31:33 compute-0 nova_compute[187118]: 2025-11-24 14:31:33.808 187122 DEBUG oslo_concurrency.lockutils [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:31:33 compute-0 nova_compute[187118]: 2025-11-24 14:31:33.808 187122 DEBUG nova.network.neutron [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:31:33 compute-0 nova_compute[187118]: 2025-11-24 14:31:33.892 187122 DEBUG nova.compute.manager [req-7ac976db-7b2f-4152-b289-9b2d24325f6e req-65e983ce-a9c8-406c-b9da-8e4874df6a8d 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-changed-9fc2ca85-0d0d-4a58-9141-850ad8736a28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:33 compute-0 nova_compute[187118]: 2025-11-24 14:31:33.893 187122 DEBUG nova.compute.manager [req-7ac976db-7b2f-4152-b289-9b2d24325f6e req-65e983ce-a9c8-406c-b9da-8e4874df6a8d 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Refreshing instance network info cache due to event network-changed-9fc2ca85-0d0d-4a58-9141-850ad8736a28. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:31:33 compute-0 nova_compute[187118]: 2025-11-24 14:31:33.893 187122 DEBUG oslo_concurrency.lockutils [req-7ac976db-7b2f-4152-b289-9b2d24325f6e req-65e983ce-a9c8-406c-b9da-8e4874df6a8d 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:31:34 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:34.203 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:34 compute-0 nova_compute[187118]: 2025-11-24 14:31:34.754 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 podman[214552]: 2025-11-24 14:31:35.484185396 +0000 UTC m=+0.082331453 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.496 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b38eba2448d56183ca526b229ab3284e63ebccfa893c0da4dc21b37b13d7f8a0" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.515 187122 DEBUG nova.network.neutron [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updating instance_info_cache with network_info: [{"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.547 187122 DEBUG oslo_concurrency.lockutils [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.549 187122 DEBUG oslo_concurrency.lockutils [req-7ac976db-7b2f-4152-b289-9b2d24325f6e req-65e983ce-a9c8-406c-b9da-8e4874df6a8d 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.550 187122 DEBUG nova.network.neutron [req-7ac976db-7b2f-4152-b289-9b2d24325f6e req-65e983ce-a9c8-406c-b9da-8e4874df6a8d 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Refreshing network info cache for port 9fc2ca85-0d0d-4a58-9141-850ad8736a28 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.557 187122 DEBUG nova.virt.libvirt.vif [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:31:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-522280633',display_name='tempest-TestNetworkBasicOps-server-522280633',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-522280633',id=3,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw01kb6tWeXzKnP2tEe2ke1IavFelsvpxw8koC03IFB6nrIOVyNbEIXtvsg/IciT0a27l1r0BucZeBOqJNDOn2UAu/N6i/WcjjG4gY5bFMiKfis5pyBCkQaDdjTkkfkHw==',key_name='tempest-TestNetworkBasicOps-1329473827',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-w0j86mxt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:31:09Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bbac6879-7cbf-4cf3-a37f-eefb9329007d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.557 187122 DEBUG nova.network.os_vif_util [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.558 187122 DEBUG nova.network.os_vif_util [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:95:39,bridge_name='br-int',has_traffic_filtering=True,id=9fc2ca85-0d0d-4a58-9141-850ad8736a28,network=Network(38bb9b4f-0a51-405a-8a7d-cd3764ab691d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc2ca85-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.558 187122 DEBUG os_vif [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:95:39,bridge_name='br-int',has_traffic_filtering=True,id=9fc2ca85-0d0d-4a58-9141-850ad8736a28,network=Network(38bb9b4f-0a51-405a-8a7d-cd3764ab691d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc2ca85-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.559 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.559 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.560 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.561 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.562 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9fc2ca85-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.562 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9fc2ca85-0d, col_values=(('external_ids', {'iface-id': '9fc2ca85-0d0d-4a58-9141-850ad8736a28', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:95:39', 'vm-uuid': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.564 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 NetworkManager[55697]: <info>  [1763994695.5653] manager: (tap9fc2ca85-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.566 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.570 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.571 187122 INFO os_vif [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:95:39,bridge_name='br-int',has_traffic_filtering=True,id=9fc2ca85-0d0d-4a58-9141-850ad8736a28,network=Network(38bb9b4f-0a51-405a-8a7d-cd3764ab691d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc2ca85-0d')
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.571 187122 DEBUG nova.virt.libvirt.vif [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:31:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-522280633',display_name='tempest-TestNetworkBasicOps-server-522280633',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-522280633',id=3,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw01kb6tWeXzKnP2tEe2ke1IavFelsvpxw8koC03IFB6nrIOVyNbEIXtvsg/IciT0a27l1r0BucZeBOqJNDOn2UAu/N6i/WcjjG4gY5bFMiKfis5pyBCkQaDdjTkkfkHw==',key_name='tempest-TestNetworkBasicOps-1329473827',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-w0j86mxt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:31:09Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bbac6879-7cbf-4cf3-a37f-eefb9329007d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.572 187122 DEBUG nova.network.os_vif_util [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.572 187122 DEBUG nova.network.os_vif_util [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:95:39,bridge_name='br-int',has_traffic_filtering=True,id=9fc2ca85-0d0d-4a58-9141-850ad8736a28,network=Network(38bb9b4f-0a51-405a-8a7d-cd3764ab691d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc2ca85-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.576 187122 DEBUG nova.virt.libvirt.guest [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] attach device xml: <interface type="ethernet">
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <mac address="fa:16:3e:f4:95:39"/>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <model type="virtio"/>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <mtu size="1442"/>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <target dev="tap9fc2ca85-0d"/>
Nov 24 14:31:35 compute-0 nova_compute[187118]: </interface>
Nov 24 14:31:35 compute-0 nova_compute[187118]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Nov 24 14:31:35 compute-0 kernel: tap9fc2ca85-0d: entered promiscuous mode
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.593 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Mon, 24 Nov 2025 14:31:35 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-f37f97b2-ba73-4720-860e-958d9faf96aa x-openstack-request-id: req-f37f97b2-ba73-4720-860e-958d9faf96aa _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.593 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "34baed15-6f06-4a27-ac4d-b55ae027ac26", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/34baed15-6f06-4a27-ac4d-b55ae027ac26"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/34baed15-6f06-4a27-ac4d-b55ae027ac26"}]}, {"id": "6e922a91-f8b6-466b-9721-3ed72f453145", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/6e922a91-f8b6-466b-9721-3ed72f453145"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/6e922a91-f8b6-466b-9721-3ed72f453145"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.593 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-f37f97b2-ba73-4720-860e-958d9faf96aa request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 24 14:31:35 compute-0 NetworkManager[55697]: <info>  [1763994695.5942] manager: (tap9fc2ca85-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.594 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.596 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/6e922a91-f8b6-466b-9721-3ed72f453145 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b38eba2448d56183ca526b229ab3284e63ebccfa893c0da4dc21b37b13d7f8a0" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 24 14:31:35 compute-0 ovn_controller[95613]: 2025-11-24T14:31:35Z|00053|binding|INFO|Claiming lport 9fc2ca85-0d0d-4a58-9141-850ad8736a28 for this chassis.
Nov 24 14:31:35 compute-0 ovn_controller[95613]: 2025-11-24T14:31:35Z|00054|binding|INFO|9fc2ca85-0d0d-4a58-9141-850ad8736a28: Claiming fa:16:3e:f4:95:39 10.100.0.24
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.605 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:95:39 10.100.0.24'], port_security=['fa:16:3e:f4:95:39 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38bb9b4f-0a51-405a-8a7d-cd3764ab691d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86594553-2610-4677-ad9a-258b4f3e5a3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f252b64-9a1f-4053-a32d-0ebf97a3916b, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=9fc2ca85-0d0d-4a58-9141-850ad8736a28) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.607 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 9fc2ca85-0d0d-4a58-9141-850ad8736a28 in datapath 38bb9b4f-0a51-405a-8a7d-cd3764ab691d bound to our chassis
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.609 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 38bb9b4f-0a51-405a-8a7d-cd3764ab691d
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.628 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a02013f5-75b3-408c-b3ac-e5b6c332a6b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.630 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap38bb9b4f-01 in ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.632 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap38bb9b4f-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.633 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a2b662e6-2a20-4c02-8936-664706940304]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.634 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[94cc2399-2d5d-4e64-af1c-2a75166885bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 systemd-udevd[214584]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.646 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[9db10dae-a6a7-4e3b-b63c-71641bc0c467]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.655 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 NetworkManager[55697]: <info>  [1763994695.6580] device (tap9fc2ca85-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:31:35 compute-0 ovn_controller[95613]: 2025-11-24T14:31:35Z|00055|binding|INFO|Setting lport 9fc2ca85-0d0d-4a58-9141-850ad8736a28 ovn-installed in OVS
Nov 24 14:31:35 compute-0 ovn_controller[95613]: 2025-11-24T14:31:35Z|00056|binding|INFO|Setting lport 9fc2ca85-0d0d-4a58-9141-850ad8736a28 up in Southbound
Nov 24 14:31:35 compute-0 NetworkManager[55697]: <info>  [1763994695.6586] device (tap9fc2ca85-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.659 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.671 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Mon, 24 Nov 2025 14:31:35 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-623a1d0e-7e09-438e-938d-55cf6bc695d6 x-openstack-request-id: req-623a1d0e-7e09-438e-938d-55cf6bc695d6 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.672 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "6e922a91-f8b6-466b-9721-3ed72f453145", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/6e922a91-f8b6-466b-9721-3ed72f453145"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/6e922a91-f8b6-466b-9721-3ed72f453145"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.672 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/6e922a91-f8b6-466b-9721-3ed72f453145 used request id req-623a1d0e-7e09-438e-938d-55cf6bc695d6 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.675 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'name': 'tempest-TestNetworkBasicOps-server-522280633', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0b17c7cc946a4f86aea7e5b323e88562', 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'hostId': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.675 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.676 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.676 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-522280633>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-522280633>]
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.677 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.677 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f925d383-e272-40bc-8993-948fff3fc17a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.682 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for bbac6879-7cbf-4cf3-a37f-eefb9329007d / tapd569200b-51 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.683 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for bbac6879-7cbf-4cf3-a37f-eefb9329007d / tap9fc2ca85-0d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.684 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.685 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42796af5-ccad-48fe-8d46-f857c2f7a20a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.677934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47b97ec8-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'f8cab5c3061bf882e1417bc9ac146e7a73167957f6d65e761efd0495bef0d8c6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.677934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47b99d4a-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '46940c530076cd1ae801af2486f4b7b7b443a7a471d9beb992559c8cbda46da8'}]}, 'timestamp': '2025-11-24 14:31:35.685986', '_unique_id': '6bf789ca870f458d9088c51de18abaed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.694 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.700 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.701 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.701 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c836453-c8b0-4ef4-b011-a1ef9f5da4e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.700993', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47bbfffe-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'c3d9817b5454bb98ccb7d20f9e98957ec8b72bcfe1b1139adba192d1c7a1b1fe'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.700993', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47bc1318-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'fd30f4cd00d58cf288c4c2cb60a4e453915bd93bd6d0d16b790e35109698efd7'}]}, 'timestamp': '2025-11-24 14:31:35.701962', '_unique_id': '128c6021944f4abdbc8498dd3291852f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.703 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.704 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.704 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.704 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-522280633>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-522280633>]
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.704 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.712 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[76bb987d-accd-4df2-bc97-7e16d205b969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.719 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f837e983-85ef-4ca9-ba02-859dd17f54d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 NetworkManager[55697]: <info>  [1763994695.7204] manager: (tap38bb9b4f-00): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Nov 24 14:31:35 compute-0 systemd-udevd[214588]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.721 187122 DEBUG nova.virt.libvirt.driver [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.722 187122 DEBUG nova.virt.libvirt.driver [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.723 187122 DEBUG nova.virt.libvirt.driver [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:f2:4f:c4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.723 187122 DEBUG nova.virt.libvirt.driver [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:f4:95:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.739 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.read.latency volume: 557960000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.740 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.read.latency volume: 56217583 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cb6c536-49a6-4743-8064-c970bb134bfc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 557960000, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-vda', 'timestamp': '2025-11-24T14:31:35.705088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47c1f170-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': '02a9d2f6131b638984eea1e0070fff7bbbf63cfe9cbca01796266e988785c047'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 56217583, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 
'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-sda', 'timestamp': '2025-11-24T14:31:35.705088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47c1ffda-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': '423a1d3ccbc14f38bfed4baf3c5782426c318758daf6a1402dab9247b7d2ef26'}]}, 'timestamp': '2025-11-24 14:31:35.740792', '_unique_id': '4083aa5a682f40bc8f3c24fd02d0d6c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.741 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.742 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.743 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '991a66eb-f4f5-4cc1-a082-150b8d1a808a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.742821', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47c25c14-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '1225717e075f30f0f88c07c027d169ebabe5b0701ff91835d53573863af9fe92'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.742821', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47c26678-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'c39c86a6e5c2fc482452b7bf35287eb41f287aa67eeddd8c6c7e6d30453f2abb'}]}, 'timestamp': '2025-11-24 14:31:35.743390', '_unique_id': 'b71a2630070848a180156f744a01058e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.744 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.745 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.745 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.write.requests volume: 331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.745 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f4ca9ae-b42f-47c8-9107-78cde5225640', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 331, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-vda', 'timestamp': '2025-11-24T14:31:35.745080', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47c2b394-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': 'e9cad9baa720db1ddd58cf6b285204ed704aeb2e560d4486f71efec6f1f19991'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 
'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-sda', 'timestamp': '2025-11-24T14:31:35.745080', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47c2bbf0-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': '0bb894fec5dcb212afc226c2905b30e0f9c604c57e6c624c0398456b1b9be465'}]}, 'timestamp': '2025-11-24 14:31:35.745551', '_unique_id': '2b4fef1249e14ec680d1c4327cba91da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.747 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.packets volume: 56 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.747 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9497f9d0-6cc1-45aa-99b2-fa8cc99cbee1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 56, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.747025', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47c30d4e-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '766cb9188bc4bf2e0cbb59304b7a28a409dc843aeb2e33ac12236e2cea492467'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.747025', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47c317b2-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'd18ff3ed8c801b445c67aa6d08d2327d47c69c13a381a0de0d9262c210771edb'}]}, 'timestamp': '2025-11-24 14:31:35.747892', '_unique_id': '028bc22eef284886b16610802ef1806a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.748 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.759 187122 DEBUG nova.virt.libvirt.guest [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-522280633</nova:name>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:31:35</nova:creationTime>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:31:35 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     <nova:port uuid="d569200b-51d5-4c6d-bc10-80fa732cc80e">
Nov 24 14:31:35 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     <nova:port uuid="9fc2ca85-0d0d-4a58-9141-850ad8736a28">
Nov 24 14:31:35 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Nov 24 14:31:35 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:31:35 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:31:35 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:31:35 compute-0 nova_compute[187118]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.762 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.763 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ee71102-c331-4eb2-9727-e3defd5c02c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-vda', 'timestamp': '2025-11-24T14:31:35.749015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47c5716a-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.238536814, 'message_signature': '267f12f2da924b949629b94d3338567327253a8d8ff7171cbcba47550bfd2cd5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-sda', 'timestamp': '2025-11-24T14:31:35.749015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47c57b6a-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.238536814, 'message_signature': 'b1577e6cb71d8bb0ae028ad93af22bbfd1e3152fac602135b3c54734c2b1f6ec'}]}, 'timestamp': '2025-11-24 14:31:35.763552', '_unique_id': '7ffab1197b5848378144d1aa320b2c7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.764 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.765 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.read.requests volume: 1140 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.765 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b41b708-076b-477e-a4b7-627ab5c24c55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1140, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-vda', 'timestamp': '2025-11-24T14:31:35.765489', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47c5d394-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': 'f60692c9913d939064a76a37390b19db6b477604468921dbd6dbf29840235bfc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-sda', 'timestamp': '2025-11-24T14:31:35.765489', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47c5df42-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': 'e9a31a8efbc3503ae7e8fe77b2e40d880706d6d9de95e7b0cf95a469a0694394'}]}, 'timestamp': '2025-11-24 14:31:35.766135', '_unique_id': '368f97e2d3c84259be11a29dd2e1a4a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.766 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.767 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.767 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.767 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-522280633>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-522280633>]
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.768 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.777 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[cca563a8-af3a-4a9d-9122-653bcf6112de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.780 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[9c6089e8-33f8-47e6-a725-0b17ef1e1428]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.786 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/memory.usage volume: 42.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a87e246-11cc-43f8-b096-a270850e5fca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.671875, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'timestamp': '2025-11-24T14:31:35.768230', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '47c90f8c-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.275548394, 'message_signature': 'dd2aa887f16d755996cff8a25aeab30aeffe3bb780e62d79bc49897d97ecdc49'}]}, 'timestamp': '2025-11-24 14:31:35.787100', '_unique_id': 'cbaa75bac6384e7498961914ba1927be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.788 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.789 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.789 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.packets volume: 58 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.789 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74f3955b-19a7-4238-8f98-adfb7d50bb98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 58, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.789480', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47c97c6a-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'feab06c49a90610064c41fb1a4ac81f1878c3e5adfafe67e6b8ac082e4141c68'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.789480', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47c98fc0-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'd3b179e5ebde3af9fd2c026bef42582db63f41decef3fa6a5846e2efa2316e65'}]}, 'timestamp': '2025-11-24 14:31:35.790336', '_unique_id': 'c46c74cbd0b240afac335997e1f6000b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.792 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.792 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-522280633>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-522280633>]
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.792 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.792 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.write.bytes volume: 72949760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.792 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21339574-4574-4c13-9915-9fe1b6abf81d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72949760, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-vda', 'timestamp': '2025-11-24T14:31:35.792553', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47c9f488-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': 'd457271d260d185f3c0e232dd6f088ed8eb05d989cccdbfa6bc3372f78707279'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 
'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-sda', 'timestamp': '2025-11-24T14:31:35.792553', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47ca004a-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': '5c0b5ec3fdcb9c2faae95eb2309330165437b6b80969da70425a896b21de16dc'}]}, 'timestamp': '2025-11-24 14:31:35.793194', '_unique_id': 'ff04afdae5644133998e4a23287c9b98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.793 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.794 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.794 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.bytes volume: 8266 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.795 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.793 187122 DEBUG oslo_concurrency.lockutils [None req-aeb8fbe1-24d7-4d22-aa90-5e91f959eed3 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "interface-bbac6879-7cbf-4cf3-a37f-eefb9329007d-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 3.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b70f264c-29f2-4af1-8b0a-08e2bec58b44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8266, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.794950', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47ca50b8-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'a0352ea7e9c7c2e4b6488e5171f13220737df7827816733b69e89c2e738fefe1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.794950', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47ca5c98-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '73ee36b5960127c3956b421135f239ca8ce7cfc12b11a3c862cd5c20bb6654ad'}]}, 'timestamp': '2025-11-24 14:31:35.795567', '_unique_id': '92625be4d2e8445c95e95611c24af13f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.796 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.797 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.797 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/cpu volume: 11610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '19843094-ac7d-45ea-a5ef-b8522792c7b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11610000000, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'timestamp': '2025-11-24T14:31:35.797392', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '47caaffe-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.275548394, 'message_signature': 'f2af52f6643ccb44947ecc8cfe37ad22388cb0e2c7975ce565f3227325ac239a'}]}, 'timestamp': '2025-11-24 14:31:35.797742', '_unique_id': '709e2affba8a4a4ea90f44dc4e9cc39f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.798 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.799 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.799 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.bytes volume: 10261 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.799 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1e763eb-9f2f-4fe6-b505-c8516de11fb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 10261, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.799463', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47cb0152-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '60236d2e0d7114eb87c7a8a36ded7d111f6e72d75b278b7747a6d0fe1a92b8d4'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.799463', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47cb0e72-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '26fc5c16b1650ceae1a2fb48504dd97fb7374d7a55ffa0e29071693974a621a1'}]}, 'timestamp': '2025-11-24 14:31:35.800120', '_unique_id': 'cccb5892b6334d07a0f99c09c3203944'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.800 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.802 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.read.bytes volume: 31037952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.802 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 NetworkManager[55697]: <info>  [1763994695.8039] device (tap38bb9b4f-00): carrier: link connected
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '097031d0-ab90-48ea-b528-ddd752e68190', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31037952, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-vda', 'timestamp': '2025-11-24T14:31:35.802078', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47cb696c-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': 'b3bf5763274bd498eb0e287a44b0c224eda8c920061acd49716a8a9cdfc78833'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 
'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-sda', 'timestamp': '2025-11-24T14:31:35.802078', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47cb7628-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': '2b3ed9c856f2c93a067bce0ef42efa2221abe9b85bfdc892281aefdc92710b86'}]}, 'timestamp': '2025-11-24 14:31:35.802832', '_unique_id': '472fda376a2a4e42b78c5c39a4d6bf41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.803 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.804 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.804 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.804 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab6efb24-ec97-4b67-95ce-da8820fab9eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-vda', 'timestamp': '2025-11-24T14:31:35.804679', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47cbccf4-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.238536814, 'message_signature': '2d148a8c0c9034eca5f9cefc59038d47810f710afcc3641a339b6efd5a6f1c57'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 
'bbac6879-7cbf-4cf3-a37f-eefb9329007d-sda', 'timestamp': '2025-11-24T14:31:35.804679', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47cbd870-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.238536814, 'message_signature': 'a9271f66468a969278562d2814984de604eeef96ee5fb97f2186b705f5bed814'}]}, 'timestamp': '2025-11-24 14:31:35.805322', '_unique_id': 'd3eef143bed74fd599215c21fa3f1715'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.806 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.807 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.write.latency volume: 3493402289 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.807 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b1b630f-a046-47f7-affa-6ffc158cb8bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3493402289, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-vda', 'timestamp': '2025-11-24T14:31:35.807049', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47cc2942-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': 'ff4c6111455b7a49a22cded12a0613e72fa77ae52ba7cc41971123dbf4495af4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 
'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-sda', 'timestamp': '2025-11-24T14:31:35.807049', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47cc341e-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.194733461, 'message_signature': '4961609651f285e99bb5c0eaa73566e10cdd06d636adad2fafa0a03fe0e8f776'}]}, 'timestamp': '2025-11-24 14:31:35.807686', '_unique_id': '8b0fae2d7e004112955e760671232751'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.808 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.809 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.809 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.809 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '409778c7-2837-463a-af9d-c853099d12e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.809412', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47cc87ac-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '53080cfc5bcb7b0aa65cd0b55a0ff43c5db94719f82e5dd4b18c1261dddd7527'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.809412', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47cc944a-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'b7592c39eec073106364f9ade3f0be05e8d621ee86cb28b31d4fa85b342fae71'}]}, 'timestamp': '2025-11-24 14:31:35.810098', '_unique_id': '5c527e4c2b504533992f72772398ddb9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.810 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.811 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.811 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.812 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.811 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[bc603e63-c1fa-499a-b547-8b5051311f2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a92b685c-9c3c-4413-acc6-18b03aea1844', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.811772', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47cce1c0-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '88388d0de0e20e76ef5292ba7f6856fb4a7579764bc9250f20e07ef6823f9bc4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.811772', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47cced14-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '56c59a5b1462e9da466f7db70bc3e755bcb1dcb3f4eda2ca2c13d902c8b72afb'}]}, 'timestamp': '2025-11-24 14:31:35.812369', '_unique_id': '14c4b08cbb6641a48735d04dae452724'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.813 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.814 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.814 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab327628-d234-4398-97da-29d8830ac64a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tapd569200b-51', 'timestamp': '2025-11-24T14:31:35.814011', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tapd569200b-51', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f2:4f:c4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapd569200b-51'}, 'message_id': '47cd3940-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': '3a0e089eb41c3ab28b56a44507a6d06034969cf4d3693fe88ba0f92ad0557d62'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-00000003-bbac6879-7cbf-4cf3-a37f-eefb9329007d-tap9fc2ca85-0d', 'timestamp': '2025-11-24T14:31:35.814011', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'tap9fc2ca85-0d', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f4:95:39', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9fc2ca85-0d'}, 'message_id': '47cd4818-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.167579894, 'message_signature': 'ff96539aedf2b387b5bce45ad15401e58300de8be0a2974b2fa0d953332e5632'}]}, 'timestamp': '2025-11-24 14:31:35.814778', '_unique_id': '1599292bf97a40779d2c10dff70ea4b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.815 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.816 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.816 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.816 12 DEBUG ceilometer.compute.pollsters [-] bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fde05318-129b-43e1-b3e7-d78d5bd2159a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d-vda', 'timestamp': '2025-11-24T14:31:35.816469', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47cda092-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.238536814, 'message_signature': 'deeca8cd5869d46f9e1c575a601784e04483df1a90e5c6a2521673039019f014'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 
'bbac6879-7cbf-4cf3-a37f-eefb9329007d-sda', 'timestamp': '2025-11-24T14:31:35.816469', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-522280633', 'name': 'instance-00000003', 'instance_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47cdad76-c942-11f0-9454-fa163e7ea22e', 'monotonic_time': 2963.238536814, 'message_signature': '9221c041e057e19ba9ee9d93cb88ba9757b476fef0e4da26caa6025945e0fe93'}]}, 'timestamp': '2025-11-24 14:31:35.817326', '_unique_id': 'f9f97e557c974c2cbd93aaf21a05bc71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:31:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:31:35.818 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.834 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[77efac22-af76-44e6-a940-9d3c51e0cecf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap38bb9b4f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:fb:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 296323, 'reachable_time': 35410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214610, 'error': None, 'target': 'ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.850 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[0db7b679-4126-406a-9c28-0f5235bd99c7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe11:fbaf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 296323, 'tstamp': 296323}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214611, 'error': None, 'target': 'ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.868 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[1a264302-3c1f-4ac4-a9db-cd30e9f6785d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap38bb9b4f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:fb:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 296323, 'reachable_time': 35410, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214612, 'error': None, 'target': 'ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.908 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[0eef8309-c529-4ee5-b57b-ea3ae8f41913]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.969 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[d8d3e8a3-803a-42b9-adc9-fd26f5bdf1a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.971 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38bb9b4f-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.971 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.971 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap38bb9b4f-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:35 compute-0 kernel: tap38bb9b4f-00: entered promiscuous mode
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.973 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 NetworkManager[55697]: <info>  [1763994695.9738] manager: (tap38bb9b4f-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.975 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.976 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap38bb9b4f-00, col_values=(('external_ids', {'iface-id': 'a6ec1de7-0ac2-4adb-9c75-89285f533c43'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.977 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 ovn_controller[95613]: 2025-11-24T14:31:35Z|00057|binding|INFO|Releasing lport a6ec1de7-0ac2-4adb-9c75-89285f533c43 from this chassis (sb_readonly=0)
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.978 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.978 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/38bb9b4f-0a51-405a-8a7d-cd3764ab691d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/38bb9b4f-0a51-405a-8a7d-cd3764ab691d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.979 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[90086c7c-066f-41f8-aae3-7f4884d8998a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.979 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-38bb9b4f-0a51-405a-8a7d-cd3764ab691d
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/38bb9b4f-0a51-405a-8a7d-cd3764ab691d.pid.haproxy
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID 38bb9b4f-0a51-405a-8a7d-cd3764ab691d
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:31:35 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:35.980 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d', 'env', 'PROCESS_TAG=haproxy-38bb9b4f-0a51-405a-8a7d-cd3764ab691d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/38bb9b4f-0a51-405a-8a7d-cd3764ab691d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:31:35 compute-0 nova_compute[187118]: 2025-11-24 14:31:35.988 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.005 187122 DEBUG nova.compute.manager [req-fee1e25f-5c9e-48a0-99b2-ba41e6fbecdb req-3df72e99-6b78-480b-b74c-377a540b4937 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-plugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.005 187122 DEBUG oslo_concurrency.lockutils [req-fee1e25f-5c9e-48a0-99b2-ba41e6fbecdb req-3df72e99-6b78-480b-b74c-377a540b4937 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.005 187122 DEBUG oslo_concurrency.lockutils [req-fee1e25f-5c9e-48a0-99b2-ba41e6fbecdb req-3df72e99-6b78-480b-b74c-377a540b4937 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.005 187122 DEBUG oslo_concurrency.lockutils [req-fee1e25f-5c9e-48a0-99b2-ba41e6fbecdb req-3df72e99-6b78-480b-b74c-377a540b4937 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.005 187122 DEBUG nova.compute.manager [req-fee1e25f-5c9e-48a0-99b2-ba41e6fbecdb req-3df72e99-6b78-480b-b74c-377a540b4937 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] No waiting events found dispatching network-vif-plugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.005 187122 WARNING nova.compute.manager [req-fee1e25f-5c9e-48a0-99b2-ba41e6fbecdb req-3df72e99-6b78-480b-b74c-377a540b4937 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received unexpected event network-vif-plugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 for instance with vm_state active and task_state None.
Nov 24 14:31:36 compute-0 podman[214644]: 2025-11-24 14:31:36.360829206 +0000 UTC m=+0.044333819 container create 50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:31:36 compute-0 systemd[1]: Started libpod-conmon-50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10.scope.
Nov 24 14:31:36 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:31:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4cb4f04fb0d63bdb66264933a34c134937eb13f667eeff05eab391ebeaf45904/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:31:36 compute-0 podman[214644]: 2025-11-24 14:31:36.430800801 +0000 UTC m=+0.114305434 container init 50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 24 14:31:36 compute-0 podman[214644]: 2025-11-24 14:31:36.337856676 +0000 UTC m=+0.021361309 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:31:36 compute-0 podman[214644]: 2025-11-24 14:31:36.435809469 +0000 UTC m=+0.119314082 container start 50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:31:36 compute-0 neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d[214660]: [NOTICE]   (214664) : New worker (214666) forked
Nov 24 14:31:36 compute-0 neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d[214660]: [NOTICE]   (214664) : Loading success.
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.830 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:31:36 compute-0 nova_compute[187118]: 2025-11-24 14:31:36.831 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.010 187122 DEBUG nova.network.neutron [req-7ac976db-7b2f-4152-b289-9b2d24325f6e req-65e983ce-a9c8-406c-b9da-8e4874df6a8d 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updated VIF entry in instance network info cache for port 9fc2ca85-0d0d-4a58-9141-850ad8736a28. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.011 187122 DEBUG nova.network.neutron [req-7ac976db-7b2f-4152-b289-9b2d24325f6e req-65e983ce-a9c8-406c-b9da-8e4874df6a8d 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updating instance_info_cache with network_info: [{"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.024 187122 DEBUG oslo_concurrency.lockutils [req-7ac976db-7b2f-4152-b289-9b2d24325f6e req-65e983ce-a9c8-406c-b9da-8e4874df6a8d 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:31:37 compute-0 ovn_controller[95613]: 2025-11-24T14:31:37Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:95:39 10.100.0.24
Nov 24 14:31:37 compute-0 ovn_controller[95613]: 2025-11-24T14:31:37Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:95:39 10.100.0.24
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.648 187122 DEBUG oslo_concurrency.lockutils [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "interface-bbac6879-7cbf-4cf3-a37f-eefb9329007d-9fc2ca85-0d0d-4a58-9141-850ad8736a28" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.648 187122 DEBUG oslo_concurrency.lockutils [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "interface-bbac6879-7cbf-4cf3-a37f-eefb9329007d-9fc2ca85-0d0d-4a58-9141-850ad8736a28" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.664 187122 DEBUG nova.objects.instance [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'flavor' on Instance uuid bbac6879-7cbf-4cf3-a37f-eefb9329007d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.683 187122 DEBUG nova.virt.libvirt.vif [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:31:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-522280633',display_name='tempest-TestNetworkBasicOps-server-522280633',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-522280633',id=3,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw01kb6tWeXzKnP2tEe2ke1IavFelsvpxw8koC03IFB6nrIOVyNbEIXtvsg/IciT0a27l1r0BucZeBOqJNDOn2UAu/N6i/WcjjG4gY5bFMiKfis5pyBCkQaDdjTkkfkHw==',key_name='tempest-TestNetworkBasicOps-1329473827',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-w0j86mxt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:31:09Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bbac6879-7cbf-4cf3-a37f-eefb9329007d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.683 187122 DEBUG nova.network.os_vif_util [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.684 187122 DEBUG nova.network.os_vif_util [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:95:39,bridge_name='br-int',has_traffic_filtering=True,id=9fc2ca85-0d0d-4a58-9141-850ad8736a28,network=Network(38bb9b4f-0a51-405a-8a7d-cd3764ab691d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc2ca85-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.687 187122 DEBUG nova.virt.libvirt.guest [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:95:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9fc2ca85-0d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.690 187122 DEBUG nova.virt.libvirt.guest [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:95:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9fc2ca85-0d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.694 187122 DEBUG nova.virt.libvirt.driver [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Attempting to detach device tap9fc2ca85-0d from instance bbac6879-7cbf-4cf3-a37f-eefb9329007d from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.694 187122 DEBUG nova.virt.libvirt.guest [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] detach device xml: <interface type="ethernet">
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <mac address="fa:16:3e:f4:95:39"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <model type="virtio"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <mtu size="1442"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <target dev="tap9fc2ca85-0d"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]: </interface>
Nov 24 14:31:37 compute-0 nova_compute[187118]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.704 187122 DEBUG nova.virt.libvirt.guest [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:95:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9fc2ca85-0d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.709 187122 DEBUG nova.virt.libvirt.guest [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f4:95:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9fc2ca85-0d"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <name>instance-00000003</name>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <uuid>bbac6879-7cbf-4cf3-a37f-eefb9329007d</uuid>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-522280633</nova:name>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:31:35</nova:creationTime>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:port uuid="d569200b-51d5-4c6d-bc10-80fa732cc80e">
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:port uuid="9fc2ca85-0d0d-4a58-9141-850ad8736a28">
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:31:37 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <memory unit='KiB'>131072</memory>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <vcpu placement='static'>1</vcpu>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <resource>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <partition>/machine</partition>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </resource>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <sysinfo type='smbios'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <system>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='manufacturer'>RDO</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='serial'>bbac6879-7cbf-4cf3-a37f-eefb9329007d</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='uuid'>bbac6879-7cbf-4cf3-a37f-eefb9329007d</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='family'>Virtual Machine</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </system>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <os>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <boot dev='hd'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <smbios mode='sysinfo'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </os>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <features>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <vmcoreinfo state='on'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </features>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <vendor>AMD</vendor>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='x2apic'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='hypervisor'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='stibp'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='ssbd'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='overflow-recov'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='succor'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='ibrs'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='lbrv'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='pause-filter'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='xsaves'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='svm'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='topoext'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='npt'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='nrip-save'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <clock offset='utc'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <timer name='hpet' present='no'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <on_poweroff>destroy</on_poweroff>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <on_reboot>restart</on_reboot>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <on_crash>destroy</on_crash>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <disk type='file' device='disk'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk' index='2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <backingStore type='file' index='3'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:         <format type='raw'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:         <source file='/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:         <backingStore/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       </backingStore>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target dev='vda' bus='virtio'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='virtio-disk0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <disk type='file' device='cdrom'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.config' index='1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <backingStore/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target dev='sda' bus='sata'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <readonly/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='sata0-0-0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pcie.0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='1' port='0x10'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='2' port='0x11'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='3' port='0x12'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.3'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='4' port='0x13'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='5' port='0x14'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.5'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='6' port='0x15'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.6'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='7' port='0x16'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.7'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='8' port='0x17'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.8'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='9' port='0x18'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.9'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='10' port='0x19'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.10'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='11' port='0x1a'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.11'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='12' port='0x1b'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.12'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='13' port='0x1c'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.13'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='14' port='0x1d'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.14'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='15' port='0x1e'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.15'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='16' port='0x1f'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.16'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='17' port='0x20'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.17'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='18' port='0x21'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.18'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='19' port='0x22'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.19'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='20' port='0x23'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.20'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='21' port='0x24'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.21'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='22' port='0x25'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.22'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='23' port='0x26'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.23'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='24' port='0x27'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.24'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='25' port='0x28'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.25'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-pci-bridge'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.26'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='usb'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='sata' index='0'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='ide'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <interface type='ethernet'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <mac address='fa:16:3e:f2:4f:c4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target dev='tapd569200b-51'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model type='virtio'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <mtu size='1442'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='net0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <interface type='ethernet'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <mac address='fa:16:3e:f4:95:39'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target dev='tap9fc2ca85-0d'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model type='virtio'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <mtu size='1442'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='net1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <serial type='pty'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/console.log' append='off'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target type='isa-serial' port='0'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:         <model name='isa-serial'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       </target>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/console.log' append='off'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target type='serial' port='0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </console>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <input type='tablet' bus='usb'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='input0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='usb' bus='0' port='1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </input>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <input type='mouse' bus='ps2'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='input1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </input>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <input type='keyboard' bus='ps2'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='input2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </input>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <listen type='address' address='::0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <audio id='1' type='none'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <video>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='video0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </video>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <watchdog model='itco' action='reset'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='watchdog0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </watchdog>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <memballoon model='virtio'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <stats period='10'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='balloon0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <rng model='virtio'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <backend model='random'>/dev/urandom</backend>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='rng0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <label>system_u:system_r:svirt_t:s0:c650,c995</label>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c650,c995</imagelabel>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <label>+107:+107</label>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <imagelabel>+107:+107</imagelabel>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:31:37 compute-0 nova_compute[187118]: </domain>
Nov 24 14:31:37 compute-0 nova_compute[187118]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.712 187122 INFO nova.virt.libvirt.driver [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully detached device tap9fc2ca85-0d from instance bbac6879-7cbf-4cf3-a37f-eefb9329007d from the persistent domain config.
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.712 187122 DEBUG nova.virt.libvirt.driver [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] (1/8): Attempting to detach device tap9fc2ca85-0d with device alias net1 from instance bbac6879-7cbf-4cf3-a37f-eefb9329007d from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.713 187122 DEBUG nova.virt.libvirt.guest [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] detach device xml: <interface type="ethernet">
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <mac address="fa:16:3e:f4:95:39"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <model type="virtio"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <mtu size="1442"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <target dev="tap9fc2ca85-0d"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]: </interface>
Nov 24 14:31:37 compute-0 nova_compute[187118]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.817 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.818 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.819 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.819 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:31:37 compute-0 kernel: tap9fc2ca85-0d (unregistering): left promiscuous mode
Nov 24 14:31:37 compute-0 NetworkManager[55697]: <info>  [1763994697.8280] device (tap9fc2ca85-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.832 187122 DEBUG nova.virt.libvirt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Received event <DeviceRemovedEvent: 1763994697.8322043, bbac6879-7cbf-4cf3-a37f-eefb9329007d => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.834 187122 DEBUG nova.virt.libvirt.driver [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Start waiting for the detach event from libvirt for device tap9fc2ca85-0d with device alias net1 for instance bbac6879-7cbf-4cf3-a37f-eefb9329007d _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.835 187122 DEBUG nova.virt.libvirt.guest [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f4:95:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9fc2ca85-0d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.835 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:37 compute-0 ovn_controller[95613]: 2025-11-24T14:31:37Z|00058|binding|INFO|Releasing lport 9fc2ca85-0d0d-4a58-9141-850ad8736a28 from this chassis (sb_readonly=0)
Nov 24 14:31:37 compute-0 ovn_controller[95613]: 2025-11-24T14:31:37Z|00059|binding|INFO|Setting lport 9fc2ca85-0d0d-4a58-9141-850ad8736a28 down in Southbound
Nov 24 14:31:37 compute-0 ovn_controller[95613]: 2025-11-24T14:31:37Z|00060|binding|INFO|Removing iface tap9fc2ca85-0d ovn-installed in OVS
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.840 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.841 187122 DEBUG nova.virt.libvirt.guest [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f4:95:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap9fc2ca85-0d"/></interface>not found in domain: <domain type='kvm' id='3'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <name>instance-00000003</name>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <uuid>bbac6879-7cbf-4cf3-a37f-eefb9329007d</uuid>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-522280633</nova:name>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:31:35</nova:creationTime>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:port uuid="d569200b-51d5-4c6d-bc10-80fa732cc80e">
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:port uuid="9fc2ca85-0d0d-4a58-9141-850ad8736a28">
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:31:37 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <memory unit='KiB'>131072</memory>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <vcpu placement='static'>1</vcpu>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <resource>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <partition>/machine</partition>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </resource>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <sysinfo type='smbios'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <system>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='manufacturer'>RDO</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='serial'>bbac6879-7cbf-4cf3-a37f-eefb9329007d</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='uuid'>bbac6879-7cbf-4cf3-a37f-eefb9329007d</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <entry name='family'>Virtual Machine</entry>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </system>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <os>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <boot dev='hd'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <smbios mode='sysinfo'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </os>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <features>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <vmcoreinfo state='on'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </features>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <vendor>AMD</vendor>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='x2apic'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='hypervisor'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='stibp'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='ssbd'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='overflow-recov'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='succor'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='ibrs'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='lbrv'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='pause-filter'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='xsaves'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='svm'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='require' name='topoext'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='npt'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <feature policy='disable' name='nrip-save'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <clock offset='utc'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <timer name='hpet' present='no'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <on_poweroff>destroy</on_poweroff>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <on_reboot>restart</on_reboot>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <on_crash>destroy</on_crash>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <disk type='file' device='disk'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk' index='2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <backingStore type='file' index='3'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:         <format type='raw'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:         <source file='/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:         <backingStore/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       </backingStore>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target dev='vda' bus='virtio'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='virtio-disk0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <disk type='file' device='cdrom'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk.config' index='1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <backingStore/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target dev='sda' bus='sata'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <readonly/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='sata0-0-0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pcie.0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='1' port='0x10'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='2' port='0x11'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='3' port='0x12'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.3'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='4' port='0x13'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='5' port='0x14'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.5'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='6' port='0x15'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.6'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='7' port='0x16'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.7'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='8' port='0x17'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.8'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='9' port='0x18'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.9'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='10' port='0x19'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.10'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='11' port='0x1a'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.11'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='12' port='0x1b'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.12'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='13' port='0x1c'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.13'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='14' port='0x1d'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.14'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='15' port='0x1e'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.15'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='16' port='0x1f'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.16'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='17' port='0x20'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.17'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='18' port='0x21'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.18'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='19' port='0x22'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.19'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='20' port='0x23'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.20'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='21' port='0x24'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.21'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='22' port='0x25'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.22'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='23' port='0x26'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.23'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='24' port='0x27'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.24'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target chassis='25' port='0x28'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.25'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model name='pcie-pci-bridge'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='pci.26'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='usb'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <controller type='sata' index='0'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='ide'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <interface type='ethernet'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <mac address='fa:16:3e:f2:4f:c4'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target dev='tapd569200b-51'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model type='virtio'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <mtu size='1442'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='net0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <serial type='pty'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/console.log' append='off'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target type='isa-serial' port='0'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:         <model name='isa-serial'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       </target>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/console.log' append='off'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <target type='serial' port='0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </console>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <input type='tablet' bus='usb'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='input0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='usb' bus='0' port='1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </input>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <input type='mouse' bus='ps2'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='input1'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </input>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <input type='keyboard' bus='ps2'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='input2'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </input>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <listen type='address' address='::0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <audio id='1' type='none'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <video>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='video0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </video>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <watchdog model='itco' action='reset'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='watchdog0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </watchdog>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <memballoon model='virtio'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <stats period='10'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='balloon0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <rng model='virtio'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <backend model='random'>/dev/urandom</backend>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <alias name='rng0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <label>system_u:system_r:svirt_t:s0:c650,c995</label>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c650,c995</imagelabel>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <label>+107:+107</label>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <imagelabel>+107:+107</imagelabel>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:31:37 compute-0 nova_compute[187118]: </domain>
Nov 24 14:31:37 compute-0 nova_compute[187118]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.842 187122 INFO nova.virt.libvirt.driver [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully detached device tap9fc2ca85-0d from instance bbac6879-7cbf-4cf3-a37f-eefb9329007d from the live domain config.
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.843 187122 DEBUG nova.virt.libvirt.vif [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:31:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-522280633',display_name='tempest-TestNetworkBasicOps-server-522280633',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-522280633',id=3,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw01kb6tWeXzKnP2tEe2ke1IavFelsvpxw8koC03IFB6nrIOVyNbEIXtvsg/IciT0a27l1r0BucZeBOqJNDOn2UAu/N6i/WcjjG4gY5bFMiKfis5pyBCkQaDdjTkkfkHw==',key_name='tempest-TestNetworkBasicOps-1329473827',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-w0j86mxt',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:31:09Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bbac6879-7cbf-4cf3-a37f-eefb9329007d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.843 187122 DEBUG nova.network.os_vif_util [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "address": "fa:16:3e:f4:95:39", "network": {"id": "38bb9b4f-0a51-405a-8a7d-cd3764ab691d", "bridge": "br-int", "label": "tempest-network-smoke--1969454033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9fc2ca85-0d", "ovs_interfaceid": "9fc2ca85-0d0d-4a58-9141-850ad8736a28", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.844 187122 DEBUG nova.network.os_vif_util [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:95:39,bridge_name='br-int',has_traffic_filtering=True,id=9fc2ca85-0d0d-4a58-9141-850ad8736a28,network=Network(38bb9b4f-0a51-405a-8a7d-cd3764ab691d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc2ca85-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.844 187122 DEBUG os_vif [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:95:39,bridge_name='br-int',has_traffic_filtering=True,id=9fc2ca85-0d0d-4a58-9141-850ad8736a28,network=Network(38bb9b4f-0a51-405a-8a7d-cd3764ab691d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc2ca85-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.846 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.846 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9fc2ca85-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:37 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:37.845 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:95:39 10.100.0.24'], port_security=['fa:16:3e:f4:95:39 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38bb9b4f-0a51-405a-8a7d-cd3764ab691d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': '86594553-2610-4677-ad9a-258b4f3e5a3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0f252b64-9a1f-4053-a32d-0ebf97a3916b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=9fc2ca85-0d0d-4a58-9141-850ad8736a28) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.847 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:37 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:37.848 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 9fc2ca85-0d0d-4a58-9141-850ad8736a28 in datapath 38bb9b4f-0a51-405a-8a7d-cd3764ab691d unbound from our chassis
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.849 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:31:37 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:37.850 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 38bb9b4f-0a51-405a-8a7d-cd3764ab691d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:31:37 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:37.852 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[98d729be-cf2a-4300-aa73-8ab4e6bcf797]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:37 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:37.853 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d namespace which is not needed anymore
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.854 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.857 187122 INFO os_vif [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:95:39,bridge_name='br-int',has_traffic_filtering=True,id=9fc2ca85-0d0d-4a58-9141-850ad8736a28,network=Network(38bb9b4f-0a51-405a-8a7d-cd3764ab691d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9fc2ca85-0d')
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.857 187122 DEBUG nova.virt.libvirt.guest [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-522280633</nova:name>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:31:37</nova:creationTime>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     <nova:port uuid="d569200b-51d5-4c6d-bc10-80fa732cc80e">
Nov 24 14:31:37 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Nov 24 14:31:37 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:31:37 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:31:37 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:31:37 compute-0 nova_compute[187118]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 14:31:37 compute-0 nova_compute[187118]: 2025-11-24 14:31:37.918 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:31:37 compute-0 neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d[214660]: [NOTICE]   (214664) : haproxy version is 2.8.14-c23fe91
Nov 24 14:31:37 compute-0 neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d[214660]: [NOTICE]   (214664) : path to executable is /usr/sbin/haproxy
Nov 24 14:31:37 compute-0 neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d[214660]: [WARNING]  (214664) : Exiting Master process...
Nov 24 14:31:37 compute-0 neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d[214660]: [WARNING]  (214664) : Exiting Master process...
Nov 24 14:31:37 compute-0 neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d[214660]: [ALERT]    (214664) : Current worker (214666) exited with code 143 (Terminated)
Nov 24 14:31:37 compute-0 neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d[214660]: [WARNING]  (214664) : All workers exited. Exiting... (0)
Nov 24 14:31:37 compute-0 systemd[1]: libpod-50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10.scope: Deactivated successfully.
Nov 24 14:31:37 compute-0 podman[214695]: 2025-11-24 14:31:37.974500033 +0000 UTC m=+0.042690726 container died 50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 14:31:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10-userdata-shm.mount: Deactivated successfully.
Nov 24 14:31:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-4cb4f04fb0d63bdb66264933a34c134937eb13f667eeff05eab391ebeaf45904-merged.mount: Deactivated successfully.
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.006 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.007 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:31:38 compute-0 podman[214695]: 2025-11-24 14:31:38.00751622 +0000 UTC m=+0.075706903 container cleanup 50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 14:31:38 compute-0 systemd[1]: libpod-conmon-50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10.scope: Deactivated successfully.
Nov 24 14:31:38 compute-0 podman[214725]: 2025-11-24 14:31:38.063466146 +0000 UTC m=+0.036164549 container remove 50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 24 14:31:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:38.067 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[4be3a0af-1a26-4ec0-859c-53efe8909f8d]: (4, ('Mon Nov 24 02:31:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d (50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10)\n50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10\nMon Nov 24 02:31:38 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d (50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10)\n50c3af35f649c6f0f160f5842ba0e3d17e922259556e55aa3a1fe0a8d4dfec10\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:38.068 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8de46f51-ba1d-4f97-b5e9-f26b4f9d0a3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:38.069 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38bb9b4f-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.070 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:38 compute-0 kernel: tap38bb9b4f-00: left promiscuous mode
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.073 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.082 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:38.085 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[83f1876f-9b17-42f0-a72a-d7930c0761f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:38.095 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e9d4b1-e1ff-493b-ad52-9c3d38167553]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:38.095 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ba06f5c0-3933-4ccb-965a-5d9165af008a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:38.110 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[17511340-c2b0-4458-a431-90edf35705b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 296313, 'reachable_time': 44935, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214745, 'error': None, 'target': 'ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:38.112 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-38bb9b4f-0a51-405a-8a7d-cd3764ab691d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:31:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:38.112 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[7834ba73-c84f-46a2-b1bd-8f9f84592573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:38 compute-0 systemd[1]: run-netns-ovnmeta\x2d38bb9b4f\x2d0a51\x2d405a\x2d8a7d\x2dcd3764ab691d.mount: Deactivated successfully.
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.138 187122 DEBUG nova.compute.manager [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-plugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.138 187122 DEBUG oslo_concurrency.lockutils [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.138 187122 DEBUG oslo_concurrency.lockutils [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.138 187122 DEBUG oslo_concurrency.lockutils [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.138 187122 DEBUG nova.compute.manager [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] No waiting events found dispatching network-vif-plugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.139 187122 WARNING nova.compute.manager [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received unexpected event network-vif-plugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 for instance with vm_state active and task_state None.
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.139 187122 DEBUG nova.compute.manager [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-unplugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.139 187122 DEBUG oslo_concurrency.lockutils [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.139 187122 DEBUG oslo_concurrency.lockutils [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.139 187122 DEBUG oslo_concurrency.lockutils [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.139 187122 DEBUG nova.compute.manager [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] No waiting events found dispatching network-vif-unplugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.140 187122 WARNING nova.compute.manager [req-fc46d647-8494-47f1-8e34-5dbd29c6e693 req-9f179ef8-a3a8-4330-8605-ebf57edd96dd 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received unexpected event network-vif-unplugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 for instance with vm_state active and task_state None.
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.228 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.229 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5546MB free_disk=73.4340705871582GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.229 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.229 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.310 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance bbac6879-7cbf-4cf3-a37f-eefb9329007d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.311 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.311 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.356 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.369 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.390 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.390 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.431 187122 DEBUG oslo_concurrency.lockutils [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.431 187122 DEBUG oslo_concurrency.lockutils [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:31:38 compute-0 nova_compute[187118]: 2025-11-24 14:31:38.431 187122 DEBUG nova.network.neutron [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:31:39 compute-0 podman[214746]: 2025-11-24 14:31:39.477834469 +0000 UTC m=+0.076823572 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent)
Nov 24 14:31:39 compute-0 nova_compute[187118]: 2025-11-24 14:31:39.544 187122 INFO nova.network.neutron [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Port 9fc2ca85-0d0d-4a58-9141-850ad8736a28 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 24 14:31:39 compute-0 nova_compute[187118]: 2025-11-24 14:31:39.545 187122 DEBUG nova.network.neutron [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updating instance_info_cache with network_info: [{"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:31:39 compute-0 nova_compute[187118]: 2025-11-24 14:31:39.597 187122 DEBUG oslo_concurrency.lockutils [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:31:39 compute-0 nova_compute[187118]: 2025-11-24 14:31:39.636 187122 DEBUG oslo_concurrency.lockutils [None req-3d6ecf61-5563-4e2f-bf53-3da7c3cbe07b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "interface-bbac6879-7cbf-4cf3-a37f-eefb9329007d-9fc2ca85-0d0d-4a58-9141-850ad8736a28" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 1.988s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:39 compute-0 ovn_controller[95613]: 2025-11-24T14:31:39Z|00061|binding|INFO|Releasing lport cfedc122-e1c6-453f-b0a9-daf44fc138ee from this chassis (sb_readonly=0)
Nov 24 14:31:39 compute-0 nova_compute[187118]: 2025-11-24 14:31:39.767 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.257 187122 DEBUG nova.compute.manager [req-5828f2a7-d37e-4fa8-87a6-ede74a95483b req-a3423390-5f09-4a69-998f-c776d080eaa6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-plugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.257 187122 DEBUG oslo_concurrency.lockutils [req-5828f2a7-d37e-4fa8-87a6-ede74a95483b req-a3423390-5f09-4a69-998f-c776d080eaa6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.258 187122 DEBUG oslo_concurrency.lockutils [req-5828f2a7-d37e-4fa8-87a6-ede74a95483b req-a3423390-5f09-4a69-998f-c776d080eaa6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.258 187122 DEBUG oslo_concurrency.lockutils [req-5828f2a7-d37e-4fa8-87a6-ede74a95483b req-a3423390-5f09-4a69-998f-c776d080eaa6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.258 187122 DEBUG nova.compute.manager [req-5828f2a7-d37e-4fa8-87a6-ede74a95483b req-a3423390-5f09-4a69-998f-c776d080eaa6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] No waiting events found dispatching network-vif-plugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.259 187122 WARNING nova.compute.manager [req-5828f2a7-d37e-4fa8-87a6-ede74a95483b req-a3423390-5f09-4a69-998f-c776d080eaa6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received unexpected event network-vif-plugged-9fc2ca85-0d0d-4a58-9141-850ad8736a28 for instance with vm_state active and task_state None.
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.259 187122 DEBUG nova.compute.manager [req-5828f2a7-d37e-4fa8-87a6-ede74a95483b req-a3423390-5f09-4a69-998f-c776d080eaa6 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-deleted-9fc2ca85-0d0d-4a58-9141-850ad8736a28 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.385 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.386 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:31:40 compute-0 nova_compute[187118]: 2025-11-24 14:31:40.386 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:31:42 compute-0 nova_compute[187118]: 2025-11-24 14:31:42.849 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.346 187122 DEBUG nova.compute.manager [req-c5e9c5ca-6008-42e6-87e5-0d53dfe87c07 req-7e05bcd7-9616-403c-9af3-e6e5ab16e3eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-changed-d569200b-51d5-4c6d-bc10-80fa732cc80e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.346 187122 DEBUG nova.compute.manager [req-c5e9c5ca-6008-42e6-87e5-0d53dfe87c07 req-7e05bcd7-9616-403c-9af3-e6e5ab16e3eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Refreshing instance network info cache due to event network-changed-d569200b-51d5-4c6d-bc10-80fa732cc80e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.347 187122 DEBUG oslo_concurrency.lockutils [req-c5e9c5ca-6008-42e6-87e5-0d53dfe87c07 req-7e05bcd7-9616-403c-9af3-e6e5ab16e3eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.347 187122 DEBUG oslo_concurrency.lockutils [req-c5e9c5ca-6008-42e6-87e5-0d53dfe87c07 req-7e05bcd7-9616-403c-9af3-e6e5ab16e3eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.348 187122 DEBUG nova.network.neutron [req-c5e9c5ca-6008-42e6-87e5-0d53dfe87c07 req-7e05bcd7-9616-403c-9af3-e6e5ab16e3eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Refreshing network info cache for port d569200b-51d5-4c6d-bc10-80fa732cc80e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.444 187122 DEBUG oslo_concurrency.lockutils [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.445 187122 DEBUG oslo_concurrency.lockutils [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.446 187122 DEBUG oslo_concurrency.lockutils [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.446 187122 DEBUG oslo_concurrency.lockutils [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.447 187122 DEBUG oslo_concurrency.lockutils [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.449 187122 INFO nova.compute.manager [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Terminating instance
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.450 187122 DEBUG nova.compute.manager [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:31:43 compute-0 kernel: tapd569200b-51 (unregistering): left promiscuous mode
Nov 24 14:31:43 compute-0 NetworkManager[55697]: <info>  [1763994703.4780] device (tapd569200b-51): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:31:43 compute-0 ovn_controller[95613]: 2025-11-24T14:31:43Z|00062|binding|INFO|Releasing lport d569200b-51d5-4c6d-bc10-80fa732cc80e from this chassis (sb_readonly=0)
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.481 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:43 compute-0 ovn_controller[95613]: 2025-11-24T14:31:43Z|00063|binding|INFO|Setting lport d569200b-51d5-4c6d-bc10-80fa732cc80e down in Southbound
Nov 24 14:31:43 compute-0 ovn_controller[95613]: 2025-11-24T14:31:43Z|00064|binding|INFO|Removing iface tapd569200b-51 ovn-installed in OVS
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.485 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.504 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:4f:c4 10.100.0.4'], port_security=['fa:16:3e:f2:4f:c4 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'bbac6879-7cbf-4cf3-a37f-eefb9329007d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-38ed537e-137f-4008-8b5e-205116f17c56', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b82a4a1e-b6e8-4ba4-a10d-838313a32f94', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12ae9689-a2d9-4eaf-8813-e59915c1ea74, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=d569200b-51d5-4c6d-bc10-80fa732cc80e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.505 104469 INFO neutron.agent.ovn.metadata.agent [-] Port d569200b-51d5-4c6d-bc10-80fa732cc80e in datapath 38ed537e-137f-4008-8b5e-205116f17c56 unbound from our chassis
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.507 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 38ed537e-137f-4008-8b5e-205116f17c56, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.508 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[24e90813-e858-4d80-804b-dd107b9909b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.508 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56 namespace which is not needed anymore
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.512 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:43 compute-0 podman[214766]: 2025-11-24 14:31:43.514143185 +0000 UTC m=+0.100947911 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:31:43 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 24 14:31:43 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 14.129s CPU time.
Nov 24 14:31:43 compute-0 systemd-machined[153483]: Machine qemu-3-instance-00000003 terminated.
Nov 24 14:31:43 compute-0 podman[214765]: 2025-11-24 14:31:43.537906595 +0000 UTC m=+0.131109115 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 24 14:31:43 compute-0 neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56[214418]: [NOTICE]   (214422) : haproxy version is 2.8.14-c23fe91
Nov 24 14:31:43 compute-0 neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56[214418]: [NOTICE]   (214422) : path to executable is /usr/sbin/haproxy
Nov 24 14:31:43 compute-0 neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56[214418]: [WARNING]  (214422) : Exiting Master process...
Nov 24 14:31:43 compute-0 neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56[214418]: [ALERT]    (214422) : Current worker (214424) exited with code 143 (Terminated)
Nov 24 14:31:43 compute-0 neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56[214418]: [WARNING]  (214422) : All workers exited. Exiting... (0)
Nov 24 14:31:43 compute-0 systemd[1]: libpod-ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb.scope: Deactivated successfully.
Nov 24 14:31:43 compute-0 podman[214827]: 2025-11-24 14:31:43.643449322 +0000 UTC m=+0.042185283 container died ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 24 14:31:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb-userdata-shm.mount: Deactivated successfully.
Nov 24 14:31:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-221bc654e95982cda48f23ba2df3333c320af8c28a90fb4bddc723126d353e9d-merged.mount: Deactivated successfully.
Nov 24 14:31:43 compute-0 podman[214827]: 2025-11-24 14:31:43.676528851 +0000 UTC m=+0.075264812 container cleanup ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 24 14:31:43 compute-0 systemd[1]: libpod-conmon-ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb.scope: Deactivated successfully.
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.708 187122 DEBUG nova.compute.manager [req-5bc49b70-ec7b-4425-8d3c-49c80af895fd req-420bfce5-629e-432b-8e44-50b426e10c38 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-unplugged-d569200b-51d5-4c6d-bc10-80fa732cc80e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.709 187122 DEBUG oslo_concurrency.lockutils [req-5bc49b70-ec7b-4425-8d3c-49c80af895fd req-420bfce5-629e-432b-8e44-50b426e10c38 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.709 187122 DEBUG oslo_concurrency.lockutils [req-5bc49b70-ec7b-4425-8d3c-49c80af895fd req-420bfce5-629e-432b-8e44-50b426e10c38 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.709 187122 DEBUG oslo_concurrency.lockutils [req-5bc49b70-ec7b-4425-8d3c-49c80af895fd req-420bfce5-629e-432b-8e44-50b426e10c38 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.710 187122 DEBUG nova.compute.manager [req-5bc49b70-ec7b-4425-8d3c-49c80af895fd req-420bfce5-629e-432b-8e44-50b426e10c38 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] No waiting events found dispatching network-vif-unplugged-d569200b-51d5-4c6d-bc10-80fa732cc80e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.710 187122 DEBUG nova.compute.manager [req-5bc49b70-ec7b-4425-8d3c-49c80af895fd req-420bfce5-629e-432b-8e44-50b426e10c38 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-unplugged-d569200b-51d5-4c6d-bc10-80fa732cc80e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.712 187122 INFO nova.virt.libvirt.driver [-] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Instance destroyed successfully.
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.712 187122 DEBUG nova.objects.instance [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid bbac6879-7cbf-4cf3-a37f-eefb9329007d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.722 187122 DEBUG nova.virt.libvirt.vif [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:31:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-522280633',display_name='tempest-TestNetworkBasicOps-server-522280633',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-522280633',id=3,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLw01kb6tWeXzKnP2tEe2ke1IavFelsvpxw8koC03IFB6nrIOVyNbEIXtvsg/IciT0a27l1r0BucZeBOqJNDOn2UAu/N6i/WcjjG4gY5bFMiKfis5pyBCkQaDdjTkkfkHw==',key_name='tempest-TestNetworkBasicOps-1329473827',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:31:09Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-w0j86mxt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:31:09Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=bbac6879-7cbf-4cf3-a37f-eefb9329007d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.723 187122 DEBUG nova.network.os_vif_util [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.723 187122 DEBUG nova.network.os_vif_util [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:4f:c4,bridge_name='br-int',has_traffic_filtering=True,id=d569200b-51d5-4c6d-bc10-80fa732cc80e,network=Network(38ed537e-137f-4008-8b5e-205116f17c56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd569200b-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.724 187122 DEBUG os_vif [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:4f:c4,bridge_name='br-int',has_traffic_filtering=True,id=d569200b-51d5-4c6d-bc10-80fa732cc80e,network=Network(38ed537e-137f-4008-8b5e-205116f17c56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd569200b-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.725 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.726 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd569200b-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.729 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.734 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:31:43 compute-0 podman[214867]: 2025-11-24 14:31:43.73573446 +0000 UTC m=+0.037838552 container remove ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.737 187122 INFO os_vif [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:4f:c4,bridge_name='br-int',has_traffic_filtering=True,id=d569200b-51d5-4c6d-bc10-80fa732cc80e,network=Network(38ed537e-137f-4008-8b5e-205116f17c56),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd569200b-51')
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.737 187122 INFO nova.virt.libvirt.driver [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Deleting instance files /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d_del
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.738 187122 INFO nova.virt.libvirt.driver [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Deletion of /var/lib/nova/instances/bbac6879-7cbf-4cf3-a37f-eefb9329007d_del complete
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.740 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5e526e-d6c1-470a-8940-860f6a2b9dcf]: (4, ('Mon Nov 24 02:31:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56 (ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb)\ned9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb\nMon Nov 24 02:31:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56 (ed9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb)\ned9f983714304f1a2300f197098548b8e27ad24a57ac7b2b26cb825b134375eb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.741 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[0628aa70-7ba3-4bee-9f21-858d07cdb6df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.742 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap38ed537e-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.743 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:43 compute-0 kernel: tap38ed537e-10: left promiscuous mode
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.765 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.768 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f0ec6784-c4e7-4d84-b970-9e42150b4a77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.782 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[2babe9c5-77e6-41d6-8299-886940c0e2a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.783 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[d8215fa1-04e5-4170-8df8-c226e1ce99a0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.788 187122 INFO nova.compute.manager [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.788 187122 DEBUG oslo.service.loopingcall [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.788 187122 DEBUG nova.compute.manager [-] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:31:43 compute-0 nova_compute[187118]: 2025-11-24 14:31:43.789 187122 DEBUG nova.network.neutron [-] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.798 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[eccf4a13-db31-42ef-982c-fab11478960b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 293609, 'reachable_time': 16483, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214891, 'error': None, 'target': 'ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.802 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-38ed537e-137f-4008-8b5e-205116f17c56 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:31:43 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:43.802 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0c31e947-b3e0-4e94-957b-31397d56fad7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:31:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d38ed537e\x2d137f\x2d4008\x2d8b5e\x2d205116f17c56.mount: Deactivated successfully.
Nov 24 14:31:44 compute-0 nova_compute[187118]: 2025-11-24 14:31:44.768 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.268 187122 DEBUG nova.network.neutron [req-c5e9c5ca-6008-42e6-87e5-0d53dfe87c07 req-7e05bcd7-9616-403c-9af3-e6e5ab16e3eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updated VIF entry in instance network info cache for port d569200b-51d5-4c6d-bc10-80fa732cc80e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.269 187122 DEBUG nova.network.neutron [req-c5e9c5ca-6008-42e6-87e5-0d53dfe87c07 req-7e05bcd7-9616-403c-9af3-e6e5ab16e3eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updating instance_info_cache with network_info: [{"id": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "address": "fa:16:3e:f2:4f:c4", "network": {"id": "38ed537e-137f-4008-8b5e-205116f17c56", "bridge": "br-int", "label": "tempest-network-smoke--380440806", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd569200b-51", "ovs_interfaceid": "d569200b-51d5-4c6d-bc10-80fa732cc80e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.296 187122 DEBUG oslo_concurrency.lockutils [req-c5e9c5ca-6008-42e6-87e5-0d53dfe87c07 req-7e05bcd7-9616-403c-9af3-e6e5ab16e3eb 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-bbac6879-7cbf-4cf3-a37f-eefb9329007d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.456 187122 DEBUG nova.network.neutron [-] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.477 187122 INFO nova.compute.manager [-] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Took 1.69 seconds to deallocate network for instance.
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.518 187122 DEBUG oslo_concurrency.lockutils [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.519 187122 DEBUG oslo_concurrency.lockutils [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.613 187122 DEBUG nova.compute.provider_tree [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.628 187122 DEBUG nova.scheduler.client.report [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.650 187122 DEBUG oslo_concurrency.lockutils [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.678 187122 INFO nova.scheduler.client.report [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance bbac6879-7cbf-4cf3-a37f-eefb9329007d
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.755 187122 DEBUG oslo_concurrency.lockutils [None req-c18d4063-5df8-45bc-b335-d792da2fa67e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.787 187122 DEBUG nova.compute.manager [req-b45572b8-d209-4716-b3dc-21208077b447 req-a578c7e0-21cb-4f75-9ca8-009f77464f53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-plugged-d569200b-51d5-4c6d-bc10-80fa732cc80e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.787 187122 DEBUG oslo_concurrency.lockutils [req-b45572b8-d209-4716-b3dc-21208077b447 req-a578c7e0-21cb-4f75-9ca8-009f77464f53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.789 187122 DEBUG oslo_concurrency.lockutils [req-b45572b8-d209-4716-b3dc-21208077b447 req-a578c7e0-21cb-4f75-9ca8-009f77464f53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.790 187122 DEBUG oslo_concurrency.lockutils [req-b45572b8-d209-4716-b3dc-21208077b447 req-a578c7e0-21cb-4f75-9ca8-009f77464f53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "bbac6879-7cbf-4cf3-a37f-eefb9329007d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.790 187122 DEBUG nova.compute.manager [req-b45572b8-d209-4716-b3dc-21208077b447 req-a578c7e0-21cb-4f75-9ca8-009f77464f53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] No waiting events found dispatching network-vif-plugged-d569200b-51d5-4c6d-bc10-80fa732cc80e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.791 187122 WARNING nova.compute.manager [req-b45572b8-d209-4716-b3dc-21208077b447 req-a578c7e0-21cb-4f75-9ca8-009f77464f53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received unexpected event network-vif-plugged-d569200b-51d5-4c6d-bc10-80fa732cc80e for instance with vm_state deleted and task_state None.
Nov 24 14:31:45 compute-0 nova_compute[187118]: 2025-11-24 14:31:45.791 187122 DEBUG nova.compute.manager [req-b45572b8-d209-4716-b3dc-21208077b447 req-a578c7e0-21cb-4f75-9ca8-009f77464f53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Received event network-vif-deleted-d569200b-51d5-4c6d-bc10-80fa732cc80e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:31:48 compute-0 nova_compute[187118]: 2025-11-24 14:31:48.728 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:48 compute-0 podman[214892]: 2025-11-24 14:31:48.811214475 +0000 UTC m=+0.055908344 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64)
Nov 24 14:31:49 compute-0 nova_compute[187118]: 2025-11-24 14:31:49.772 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:50 compute-0 nova_compute[187118]: 2025-11-24 14:31:50.276 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:50 compute-0 nova_compute[187118]: 2025-11-24 14:31:50.343 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:50 compute-0 podman[214913]: 2025-11-24 14:31:50.469491916 +0000 UTC m=+0.083667128 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:31:53 compute-0 nova_compute[187118]: 2025-11-24 14:31:53.731 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:54 compute-0 podman[214940]: 2025-11-24 14:31:54.476004358 +0000 UTC m=+0.075748444 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:31:54 compute-0 nova_compute[187118]: 2025-11-24 14:31:54.774 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:56.658 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:31:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:56.659 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:31:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:31:56.659 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:31:58 compute-0 nova_compute[187118]: 2025-11-24 14:31:58.709 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763994703.705936, bbac6879-7cbf-4cf3-a37f-eefb9329007d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:31:58 compute-0 nova_compute[187118]: 2025-11-24 14:31:58.709 187122 INFO nova.compute.manager [-] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] VM Stopped (Lifecycle Event)
Nov 24 14:31:58 compute-0 nova_compute[187118]: 2025-11-24 14:31:58.731 187122 DEBUG nova.compute.manager [None req-07b618d8-bcc6-4be8-a472-38af4ddf15ca - - - - - -] [instance: bbac6879-7cbf-4cf3-a37f-eefb9329007d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:31:58 compute-0 nova_compute[187118]: 2025-11-24 14:31:58.735 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:31:59 compute-0 nova_compute[187118]: 2025-11-24 14:31:59.776 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:03 compute-0 nova_compute[187118]: 2025-11-24 14:32:03.738 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:04 compute-0 nova_compute[187118]: 2025-11-24 14:32:04.779 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.347 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "239be499-c936-4d64-a260-7b5702d8709e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.348 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.379 187122 DEBUG nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.454 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.455 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.464 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.465 187122 INFO nova.compute.claims [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:32:06 compute-0 podman[214963]: 2025-11-24 14:32:06.478685991 +0000 UTC m=+0.075485058 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.570 187122 DEBUG nova.compute.provider_tree [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.586 187122 DEBUG nova.scheduler.client.report [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.616 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.617 187122 DEBUG nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.664 187122 DEBUG nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.665 187122 DEBUG nova.network.neutron [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.686 187122 INFO nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.706 187122 DEBUG nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.808 187122 DEBUG nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.809 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.810 187122 INFO nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Creating image(s)
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.811 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.811 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.812 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.831 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.885 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.886 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.887 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.902 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.956 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.958 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.994 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.995 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:06 compute-0 nova_compute[187118]: 2025-11-24 14:32:06.995 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.050 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.052 187122 DEBUG nova.virt.disk.api [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.052 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.118 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.119 187122 DEBUG nova.virt.disk.api [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.119 187122 DEBUG nova.objects.instance [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid 239be499-c936-4d64-a260-7b5702d8709e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.137 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.138 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Ensure instance console log exists: /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.138 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.138 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.139 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:07 compute-0 nova_compute[187118]: 2025-11-24 14:32:07.262 187122 DEBUG nova.policy [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.010 187122 DEBUG nova.network.neutron [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Successfully created port: d3287295-b9fe-4bf9-bf10-567417593602 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.554 187122 DEBUG nova.network.neutron [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Successfully updated port: d3287295-b9fe-4bf9-bf10-567417593602 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.588 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.588 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.589 187122 DEBUG nova.network.neutron [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.667 187122 DEBUG nova.compute.manager [req-1b05532b-a178-4abb-b395-d035e6c066ad req-f43b4d29-3962-4117-80fa-507d6213bcb9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received event network-changed-d3287295-b9fe-4bf9-bf10-567417593602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.667 187122 DEBUG nova.compute.manager [req-1b05532b-a178-4abb-b395-d035e6c066ad req-f43b4d29-3962-4117-80fa-507d6213bcb9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Refreshing instance network info cache due to event network-changed-d3287295-b9fe-4bf9-bf10-567417593602. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.668 187122 DEBUG oslo_concurrency.lockutils [req-1b05532b-a178-4abb-b395-d035e6c066ad req-f43b4d29-3962-4117-80fa-507d6213bcb9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.742 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:08 compute-0 nova_compute[187118]: 2025-11-24 14:32:08.751 187122 DEBUG nova.network.neutron [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.566 187122 DEBUG nova.network.neutron [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updating instance_info_cache with network_info: [{"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.583 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.583 187122 DEBUG nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Instance network_info: |[{"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.584 187122 DEBUG oslo_concurrency.lockutils [req-1b05532b-a178-4abb-b395-d035e6c066ad req-f43b4d29-3962-4117-80fa-507d6213bcb9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.584 187122 DEBUG nova.network.neutron [req-1b05532b-a178-4abb-b395-d035e6c066ad req-f43b4d29-3962-4117-80fa-507d6213bcb9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Refreshing network info cache for port d3287295-b9fe-4bf9-bf10-567417593602 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.586 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Start _get_guest_xml network_info=[{"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.590 187122 WARNING nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.594 187122 DEBUG nova.virt.libvirt.host [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.594 187122 DEBUG nova.virt.libvirt.host [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.600 187122 DEBUG nova.virt.libvirt.host [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.600 187122 DEBUG nova.virt.libvirt.host [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.601 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.601 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.601 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.602 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.602 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.602 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.602 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.602 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.603 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.603 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.603 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.603 187122 DEBUG nova.virt.hardware [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.606 187122 DEBUG nova.virt.libvirt.vif [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:32:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-469240546',display_name='tempest-TestNetworkBasicOps-server-469240546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-469240546',id=4,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKxe74ZcgTYujYW20bBm/F2VsORUD4tAq4N+l8Q3k4S31rtcRGawvuSKeYcLd3qb0oCLPQVxECH8WAslJ4/Gv/sMGAO54E5uUvsc98LyelGw3wULG0uLrRcuBw3seWweYQ==',key_name='tempest-TestNetworkBasicOps-1370680654',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-2hn5qdal',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:32:06Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=239be499-c936-4d64-a260-7b5702d8709e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.607 187122 DEBUG nova.network.os_vif_util [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.607 187122 DEBUG nova.network.os_vif_util [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:c1:9a,bridge_name='br-int',has_traffic_filtering=True,id=d3287295-b9fe-4bf9-bf10-567417593602,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3287295-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.608 187122 DEBUG nova.objects.instance [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid 239be499-c936-4d64-a260-7b5702d8709e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.617 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <uuid>239be499-c936-4d64-a260-7b5702d8709e</uuid>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <name>instance-00000004</name>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-469240546</nova:name>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:32:09</nova:creationTime>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:32:09 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:32:09 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:32:09 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:32:09 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:32:09 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:32:09 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:32:09 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:32:09 compute-0 nova_compute[187118]:         <nova:port uuid="d3287295-b9fe-4bf9-bf10-567417593602">
Nov 24 14:32:09 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <system>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <entry name="serial">239be499-c936-4d64-a260-7b5702d8709e</entry>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <entry name="uuid">239be499-c936-4d64-a260-7b5702d8709e</entry>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     </system>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <os>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   </os>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <features>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   </features>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk.config"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:41:c1:9a"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <target dev="tapd3287295-b9"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/console.log" append="off"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <video>
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     </video>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:32:09 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:32:09 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:32:09 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:32:09 compute-0 nova_compute[187118]: </domain>
Nov 24 14:32:09 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.619 187122 DEBUG nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Preparing to wait for external event network-vif-plugged-d3287295-b9fe-4bf9-bf10-567417593602 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.619 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "239be499-c936-4d64-a260-7b5702d8709e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.619 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.620 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.620 187122 DEBUG nova.virt.libvirt.vif [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:32:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-469240546',display_name='tempest-TestNetworkBasicOps-server-469240546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-469240546',id=4,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKxe74ZcgTYujYW20bBm/F2VsORUD4tAq4N+l8Q3k4S31rtcRGawvuSKeYcLd3qb0oCLPQVxECH8WAslJ4/Gv/sMGAO54E5uUvsc98LyelGw3wULG0uLrRcuBw3seWweYQ==',key_name='tempest-TestNetworkBasicOps-1370680654',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-2hn5qdal',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:32:06Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=239be499-c936-4d64-a260-7b5702d8709e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.621 187122 DEBUG nova.network.os_vif_util [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.622 187122 DEBUG nova.network.os_vif_util [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:41:c1:9a,bridge_name='br-int',has_traffic_filtering=True,id=d3287295-b9fe-4bf9-bf10-567417593602,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3287295-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.622 187122 DEBUG os_vif [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:c1:9a,bridge_name='br-int',has_traffic_filtering=True,id=d3287295-b9fe-4bf9-bf10-567417593602,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3287295-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.623 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.623 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.624 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.627 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.627 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd3287295-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.627 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd3287295-b9, col_values=(('external_ids', {'iface-id': 'd3287295-b9fe-4bf9-bf10-567417593602', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:41:c1:9a', 'vm-uuid': '239be499-c936-4d64-a260-7b5702d8709e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.629 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:09 compute-0 NetworkManager[55697]: <info>  [1763994729.6299] manager: (tapd3287295-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.632 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.634 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.635 187122 INFO os_vif [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:41:c1:9a,bridge_name='br-int',has_traffic_filtering=True,id=d3287295-b9fe-4bf9-bf10-567417593602,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3287295-b9')
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.683 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.684 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.684 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:41:c1:9a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.684 187122 INFO nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Using config drive
Nov 24 14:32:09 compute-0 podman[215005]: 2025-11-24 14:32:09.720684893 +0000 UTC m=+0.052662966 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.780 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.964 187122 INFO nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Creating config drive at /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk.config
Nov 24 14:32:09 compute-0 nova_compute[187118]: 2025-11-24 14:32:09.970 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzscs5ack execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.095 187122 DEBUG oslo_concurrency.processutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzscs5ack" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:10 compute-0 kernel: tapd3287295-b9: entered promiscuous mode
Nov 24 14:32:10 compute-0 NetworkManager[55697]: <info>  [1763994730.1638] manager: (tapd3287295-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Nov 24 14:32:10 compute-0 ovn_controller[95613]: 2025-11-24T14:32:10Z|00065|binding|INFO|Claiming lport d3287295-b9fe-4bf9-bf10-567417593602 for this chassis.
Nov 24 14:32:10 compute-0 ovn_controller[95613]: 2025-11-24T14:32:10Z|00066|binding|INFO|d3287295-b9fe-4bf9-bf10-567417593602: Claiming fa:16:3e:41:c1:9a 10.100.0.14
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.164 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.173 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.176 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.195 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:c1:9a 10.100.0.14'], port_security=['fa:16:3e:41:c1:9a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '239be499-c936-4d64-a260-7b5702d8709e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7a93cb7d-4e07-42bc-bcf1-b5647ae1be26', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d43fe65-8fe5-4c92-8b2d-ca15b1b3e2af, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=d3287295-b9fe-4bf9-bf10-567417593602) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.197 104469 INFO neutron.agent.ovn.metadata.agent [-] Port d3287295-b9fe-4bf9-bf10-567417593602 in datapath ec4c56dd-0181-49fa-aa54-bd6c0e4050bc bound to our chassis
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.197 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec4c56dd-0181-49fa-aa54-bd6c0e4050bc
Nov 24 14:32:10 compute-0 systemd-machined[153483]: New machine qemu-4-instance-00000004.
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.209 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[de4fdbe3-c9e1-4bc5-bcc7-1f896f6737e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.210 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapec4c56dd-01 in ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.213 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapec4c56dd-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.213 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[c96d38c0-62e0-4cb9-9500-fea4b673459e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.214 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[439fa623-1e36-4339-ab6e-c4ae3e9463fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.225 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c937d7fe-5bd9-4c05-b618-400f50173e4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_controller[95613]: 2025-11-24T14:32:10Z|00067|binding|INFO|Setting lport d3287295-b9fe-4bf9-bf10-567417593602 ovn-installed in OVS
Nov 24 14:32:10 compute-0 ovn_controller[95613]: 2025-11-24T14:32:10Z|00068|binding|INFO|Setting lport d3287295-b9fe-4bf9-bf10-567417593602 up in Southbound
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.234 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.243 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8675ab5c-2a3c-493f-9f3d-2f2e42f79932]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Nov 24 14:32:10 compute-0 systemd-udevd[215046]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:32:10 compute-0 NetworkManager[55697]: <info>  [1763994730.2715] device (tapd3287295-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:32:10 compute-0 NetworkManager[55697]: <info>  [1763994730.2727] device (tapd3287295-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.274 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[d2edf079-566e-41b0-bb50-5f5d53aa70b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.280 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[dc92b04b-c71b-4150-a360-9f2927e5a6bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 NetworkManager[55697]: <info>  [1763994730.2823] manager: (tapec4c56dd-00): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.319 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[c54d75cf-2c2a-4a9e-8fe9-eda4b2c39ac8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.322 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[a88b2aa5-2eb2-4ca4-bbb3-cf4038c67d5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 NetworkManager[55697]: <info>  [1763994730.3466] device (tapec4c56dd-00): carrier: link connected
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.354 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[3bdf1fe5-5d5a-4474-bb05-9fe4cf6ce1cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.370 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[89aa796f-9cec-4d78-b7e4-228823980503]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec4c56dd-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:18:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 299777, 'reachable_time': 21366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215076, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.390 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5ce1a9-4ef6-44cf-9935-4b8e060c832d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:18ea'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 299777, 'tstamp': 299777}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215077, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.406 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[b636ef16-4775-471b-835e-8c82b8445192]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec4c56dd-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:18:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 299777, 'reachable_time': 21366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215078, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.438 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[b11e748e-74a2-43aa-8127-fa355c2197b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.490 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[edfb5b98-4e50-454d-a4ce-ae9239c7993c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.492 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec4c56dd-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.493 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.493 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec4c56dd-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.495 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:10 compute-0 kernel: tapec4c56dd-00: entered promiscuous mode
Nov 24 14:32:10 compute-0 NetworkManager[55697]: <info>  [1763994730.4959] manager: (tapec4c56dd-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.501 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec4c56dd-00, col_values=(('external_ids', {'iface-id': 'fbf12914-b379-48b4-acb2-a36d9312e540'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:10 compute-0 ovn_controller[95613]: 2025-11-24T14:32:10Z|00069|binding|INFO|Releasing lport fbf12914-b379-48b4-acb2-a36d9312e540 from this chassis (sb_readonly=0)
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.504 187122 DEBUG nova.compute.manager [req-5bd173e3-de94-43c2-85ce-030bb2c8db75 req-c8b9af1b-00ec-4e08-845d-63c2f579850c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received event network-vif-plugged-d3287295-b9fe-4bf9-bf10-567417593602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.505 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ec4c56dd-0181-49fa-aa54-bd6c0e4050bc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ec4c56dd-0181-49fa-aa54-bd6c0e4050bc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.505 187122 DEBUG oslo_concurrency.lockutils [req-5bd173e3-de94-43c2-85ce-030bb2c8db75 req-c8b9af1b-00ec-4e08-845d-63c2f579850c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "239be499-c936-4d64-a260-7b5702d8709e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.505 187122 DEBUG oslo_concurrency.lockutils [req-5bd173e3-de94-43c2-85ce-030bb2c8db75 req-c8b9af1b-00ec-4e08-845d-63c2f579850c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.506 187122 DEBUG oslo_concurrency.lockutils [req-5bd173e3-de94-43c2-85ce-030bb2c8db75 req-c8b9af1b-00ec-4e08-845d-63c2f579850c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.506 187122 DEBUG nova.compute.manager [req-5bd173e3-de94-43c2-85ce-030bb2c8db75 req-c8b9af1b-00ec-4e08-845d-63c2f579850c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Processing event network-vif-plugged-d3287295-b9fe-4bf9-bf10-567417593602 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.510 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.510 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3009def8-e151-4169-8e73-262ae18f4bd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.511 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/ec4c56dd-0181-49fa-aa54-bd6c0e4050bc.pid.haproxy
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID ec4c56dd-0181-49fa-aa54-bd6c0e4050bc
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:32:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:10.512 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'env', 'PROCESS_TAG=haproxy-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ec4c56dd-0181-49fa-aa54-bd6c0e4050bc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.513 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.627 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994730.627138, 239be499-c936-4d64-a260-7b5702d8709e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.628 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] VM Started (Lifecycle Event)
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.631 187122 DEBUG nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.634 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.637 187122 INFO nova.virt.libvirt.driver [-] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Instance spawned successfully.
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.637 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.647 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.653 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.656 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.656 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.657 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.657 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.658 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.659 187122 DEBUG nova.virt.libvirt.driver [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.686 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.687 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994730.6272328, 239be499-c936-4d64-a260-7b5702d8709e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.687 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] VM Paused (Lifecycle Event)
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.716 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.719 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994730.6336775, 239be499-c936-4d64-a260-7b5702d8709e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.719 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] VM Resumed (Lifecycle Event)
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.736 187122 INFO nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Took 3.93 seconds to spawn the instance on the hypervisor.
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.736 187122 DEBUG nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.739 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.745 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.780 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.814 187122 INFO nova.compute.manager [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Took 4.39 seconds to build instance.
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.820 187122 DEBUG nova.network.neutron [req-1b05532b-a178-4abb-b395-d035e6c066ad req-f43b4d29-3962-4117-80fa-507d6213bcb9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updated VIF entry in instance network info cache for port d3287295-b9fe-4bf9-bf10-567417593602. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.820 187122 DEBUG nova.network.neutron [req-1b05532b-a178-4abb-b395-d035e6c066ad req-f43b4d29-3962-4117-80fa-507d6213bcb9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updating instance_info_cache with network_info: [{"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.829 187122 DEBUG oslo_concurrency.lockutils [None req-0022416e-3c2d-44b8-9cdb-af2429bc0fc0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.482s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:10 compute-0 nova_compute[187118]: 2025-11-24 14:32:10.831 187122 DEBUG oslo_concurrency.lockutils [req-1b05532b-a178-4abb-b395-d035e6c066ad req-f43b4d29-3962-4117-80fa-507d6213bcb9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:32:10 compute-0 podman[215115]: 2025-11-24 14:32:10.898560568 +0000 UTC m=+0.067431875 container create 6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:32:10 compute-0 systemd[1]: Started libpod-conmon-6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb.scope.
Nov 24 14:32:10 compute-0 podman[215115]: 2025-11-24 14:32:10.851731581 +0000 UTC m=+0.020602908 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:32:10 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:32:10 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe8b8d5daccecdfdf9ed5329357073d4794e7359299b710c0fce88333f3bf20a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:32:10 compute-0 podman[215115]: 2025-11-24 14:32:10.977530614 +0000 UTC m=+0.146401911 container init 6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 24 14:32:10 compute-0 podman[215115]: 2025-11-24 14:32:10.982701334 +0000 UTC m=+0.151572631 container start 6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 14:32:11 compute-0 neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc[215130]: [NOTICE]   (215134) : New worker (215136) forked
Nov 24 14:32:11 compute-0 neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc[215130]: [NOTICE]   (215134) : Loading success.
Nov 24 14:32:12 compute-0 nova_compute[187118]: 2025-11-24 14:32:12.591 187122 DEBUG nova.compute.manager [req-2654229c-ae56-40b9-ab1c-93af5f2f8af9 req-844bce43-00ed-4882-be10-99a8e756c319 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received event network-vif-plugged-d3287295-b9fe-4bf9-bf10-567417593602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:32:12 compute-0 nova_compute[187118]: 2025-11-24 14:32:12.592 187122 DEBUG oslo_concurrency.lockutils [req-2654229c-ae56-40b9-ab1c-93af5f2f8af9 req-844bce43-00ed-4882-be10-99a8e756c319 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "239be499-c936-4d64-a260-7b5702d8709e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:12 compute-0 nova_compute[187118]: 2025-11-24 14:32:12.593 187122 DEBUG oslo_concurrency.lockutils [req-2654229c-ae56-40b9-ab1c-93af5f2f8af9 req-844bce43-00ed-4882-be10-99a8e756c319 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:12 compute-0 nova_compute[187118]: 2025-11-24 14:32:12.593 187122 DEBUG oslo_concurrency.lockutils [req-2654229c-ae56-40b9-ab1c-93af5f2f8af9 req-844bce43-00ed-4882-be10-99a8e756c319 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:12 compute-0 nova_compute[187118]: 2025-11-24 14:32:12.593 187122 DEBUG nova.compute.manager [req-2654229c-ae56-40b9-ab1c-93af5f2f8af9 req-844bce43-00ed-4882-be10-99a8e756c319 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] No waiting events found dispatching network-vif-plugged-d3287295-b9fe-4bf9-bf10-567417593602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:32:12 compute-0 nova_compute[187118]: 2025-11-24 14:32:12.594 187122 WARNING nova.compute.manager [req-2654229c-ae56-40b9-ab1c-93af5f2f8af9 req-844bce43-00ed-4882-be10-99a8e756c319 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received unexpected event network-vif-plugged-d3287295-b9fe-4bf9-bf10-567417593602 for instance with vm_state active and task_state None.
Nov 24 14:32:14 compute-0 podman[215146]: 2025-11-24 14:32:14.459418248 +0000 UTC m=+0.059006347 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2)
Nov 24 14:32:14 compute-0 podman[215145]: 2025-11-24 14:32:14.493840489 +0000 UTC m=+0.088909186 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:32:14 compute-0 nova_compute[187118]: 2025-11-24 14:32:14.630 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:14 compute-0 nova_compute[187118]: 2025-11-24 14:32:14.785 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:15 compute-0 NetworkManager[55697]: <info>  [1763994735.2286] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 24 14:32:15 compute-0 NetworkManager[55697]: <info>  [1763994735.2299] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Nov 24 14:32:15 compute-0 nova_compute[187118]: 2025-11-24 14:32:15.228 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:15 compute-0 ovn_controller[95613]: 2025-11-24T14:32:15Z|00070|binding|INFO|Releasing lport fbf12914-b379-48b4-acb2-a36d9312e540 from this chassis (sb_readonly=0)
Nov 24 14:32:15 compute-0 nova_compute[187118]: 2025-11-24 14:32:15.257 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:15 compute-0 ovn_controller[95613]: 2025-11-24T14:32:15Z|00071|binding|INFO|Releasing lport fbf12914-b379-48b4-acb2-a36d9312e540 from this chassis (sb_readonly=0)
Nov 24 14:32:15 compute-0 nova_compute[187118]: 2025-11-24 14:32:15.262 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:15 compute-0 nova_compute[187118]: 2025-11-24 14:32:15.487 187122 DEBUG nova.compute.manager [req-4977d471-a95b-4336-9ba0-c5e25fd18120 req-76ef0602-1a13-45d7-9151-fb1de99b0c03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received event network-changed-d3287295-b9fe-4bf9-bf10-567417593602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:32:15 compute-0 nova_compute[187118]: 2025-11-24 14:32:15.488 187122 DEBUG nova.compute.manager [req-4977d471-a95b-4336-9ba0-c5e25fd18120 req-76ef0602-1a13-45d7-9151-fb1de99b0c03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Refreshing instance network info cache due to event network-changed-d3287295-b9fe-4bf9-bf10-567417593602. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:32:15 compute-0 nova_compute[187118]: 2025-11-24 14:32:15.489 187122 DEBUG oslo_concurrency.lockutils [req-4977d471-a95b-4336-9ba0-c5e25fd18120 req-76ef0602-1a13-45d7-9151-fb1de99b0c03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:32:15 compute-0 nova_compute[187118]: 2025-11-24 14:32:15.489 187122 DEBUG oslo_concurrency.lockutils [req-4977d471-a95b-4336-9ba0-c5e25fd18120 req-76ef0602-1a13-45d7-9151-fb1de99b0c03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:32:15 compute-0 nova_compute[187118]: 2025-11-24 14:32:15.490 187122 DEBUG nova.network.neutron [req-4977d471-a95b-4336-9ba0-c5e25fd18120 req-76ef0602-1a13-45d7-9151-fb1de99b0c03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Refreshing network info cache for port d3287295-b9fe-4bf9-bf10-567417593602 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:32:16 compute-0 nova_compute[187118]: 2025-11-24 14:32:16.661 187122 DEBUG nova.network.neutron [req-4977d471-a95b-4336-9ba0-c5e25fd18120 req-76ef0602-1a13-45d7-9151-fb1de99b0c03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updated VIF entry in instance network info cache for port d3287295-b9fe-4bf9-bf10-567417593602. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:32:16 compute-0 nova_compute[187118]: 2025-11-24 14:32:16.662 187122 DEBUG nova.network.neutron [req-4977d471-a95b-4336-9ba0-c5e25fd18120 req-76ef0602-1a13-45d7-9151-fb1de99b0c03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updating instance_info_cache with network_info: [{"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:32:16 compute-0 nova_compute[187118]: 2025-11-24 14:32:16.680 187122 DEBUG oslo_concurrency.lockutils [req-4977d471-a95b-4336-9ba0-c5e25fd18120 req-76ef0602-1a13-45d7-9151-fb1de99b0c03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:32:19 compute-0 podman[215184]: 2025-11-24 14:32:19.489052573 +0000 UTC m=+0.087551169 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, version=9.6, io.openshift.tags=minimal rhel9)
Nov 24 14:32:19 compute-0 nova_compute[187118]: 2025-11-24 14:32:19.632 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:19 compute-0 nova_compute[187118]: 2025-11-24 14:32:19.787 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:21 compute-0 podman[215204]: 2025-11-24 14:32:21.502682427 +0000 UTC m=+0.105946817 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:32:24 compute-0 ovn_controller[95613]: 2025-11-24T14:32:24Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:41:c1:9a 10.100.0.14
Nov 24 14:32:24 compute-0 ovn_controller[95613]: 2025-11-24T14:32:24Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:41:c1:9a 10.100.0.14
Nov 24 14:32:24 compute-0 nova_compute[187118]: 2025-11-24 14:32:24.633 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:24 compute-0 nova_compute[187118]: 2025-11-24 14:32:24.788 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:25 compute-0 podman[215252]: 2025-11-24 14:32:25.443447645 +0000 UTC m=+0.054039214 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:32:29 compute-0 nova_compute[187118]: 2025-11-24 14:32:29.635 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:29 compute-0 nova_compute[187118]: 2025-11-24 14:32:29.791 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:30 compute-0 nova_compute[187118]: 2025-11-24 14:32:30.025 187122 INFO nova.compute.manager [None req-656ee982-f7e6-4aec-b457-85451c8bd80a ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Get console output
Nov 24 14:32:30 compute-0 nova_compute[187118]: 2025-11-24 14:32:30.033 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:32:31 compute-0 nova_compute[187118]: 2025-11-24 14:32:31.731 187122 DEBUG nova.compute.manager [req-d1d9da92-a693-4ff9-b1a4-5c6bd7651425 req-2fc4b1ee-548a-428a-bad5-d7972a95a8af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received event network-changed-d3287295-b9fe-4bf9-bf10-567417593602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:32:31 compute-0 nova_compute[187118]: 2025-11-24 14:32:31.731 187122 DEBUG nova.compute.manager [req-d1d9da92-a693-4ff9-b1a4-5c6bd7651425 req-2fc4b1ee-548a-428a-bad5-d7972a95a8af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Refreshing instance network info cache due to event network-changed-d3287295-b9fe-4bf9-bf10-567417593602. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:32:31 compute-0 nova_compute[187118]: 2025-11-24 14:32:31.732 187122 DEBUG oslo_concurrency.lockutils [req-d1d9da92-a693-4ff9-b1a4-5c6bd7651425 req-2fc4b1ee-548a-428a-bad5-d7972a95a8af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:32:31 compute-0 nova_compute[187118]: 2025-11-24 14:32:31.732 187122 DEBUG oslo_concurrency.lockutils [req-d1d9da92-a693-4ff9-b1a4-5c6bd7651425 req-2fc4b1ee-548a-428a-bad5-d7972a95a8af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:32:31 compute-0 nova_compute[187118]: 2025-11-24 14:32:31.732 187122 DEBUG nova.network.neutron [req-d1d9da92-a693-4ff9-b1a4-5c6bd7651425 req-2fc4b1ee-548a-428a-bad5-d7972a95a8af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Refreshing network info cache for port d3287295-b9fe-4bf9-bf10-567417593602 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:32:32 compute-0 nova_compute[187118]: 2025-11-24 14:32:32.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:33 compute-0 nova_compute[187118]: 2025-11-24 14:32:33.074 187122 DEBUG nova.network.neutron [req-d1d9da92-a693-4ff9-b1a4-5c6bd7651425 req-2fc4b1ee-548a-428a-bad5-d7972a95a8af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updated VIF entry in instance network info cache for port d3287295-b9fe-4bf9-bf10-567417593602. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:32:33 compute-0 nova_compute[187118]: 2025-11-24 14:32:33.075 187122 DEBUG nova.network.neutron [req-d1d9da92-a693-4ff9-b1a4-5c6bd7651425 req-2fc4b1ee-548a-428a-bad5-d7972a95a8af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updating instance_info_cache with network_info: [{"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:32:33 compute-0 nova_compute[187118]: 2025-11-24 14:32:33.093 187122 DEBUG oslo_concurrency.lockutils [req-d1d9da92-a693-4ff9-b1a4-5c6bd7651425 req-2fc4b1ee-548a-428a-bad5-d7972a95a8af 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:32:33 compute-0 nova_compute[187118]: 2025-11-24 14:32:33.136 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:33 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:33.136 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:32:33 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:33.139 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:32:34 compute-0 nova_compute[187118]: 2025-11-24 14:32:34.638 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:34 compute-0 nova_compute[187118]: 2025-11-24 14:32:34.794 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:35 compute-0 nova_compute[187118]: 2025-11-24 14:32:35.790 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:35 compute-0 nova_compute[187118]: 2025-11-24 14:32:35.808 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:36 compute-0 nova_compute[187118]: 2025-11-24 14:32:36.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:37 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:37.140 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:37 compute-0 podman[215276]: 2025-11-24 14:32:37.437914795 +0000 UTC m=+0.049016766 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:32:37 compute-0 nova_compute[187118]: 2025-11-24 14:32:37.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:37 compute-0 nova_compute[187118]: 2025-11-24 14:32:37.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:32:37 compute-0 nova_compute[187118]: 2025-11-24 14:32:37.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:32:37 compute-0 nova_compute[187118]: 2025-11-24 14:32:37.981 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:32:37 compute-0 nova_compute[187118]: 2025-11-24 14:32:37.982 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquired lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:32:37 compute-0 nova_compute[187118]: 2025-11-24 14:32:37.982 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 14:32:37 compute-0 nova_compute[187118]: 2025-11-24 14:32:37.982 187122 DEBUG nova.objects.instance [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 239be499-c936-4d64-a260-7b5702d8709e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:32:39 compute-0 nova_compute[187118]: 2025-11-24 14:32:39.640 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:39 compute-0 nova_compute[187118]: 2025-11-24 14:32:39.797 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:40 compute-0 podman[215300]: 2025-11-24 14:32:40.466163607 +0000 UTC m=+0.077821526 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.728 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updating instance_info_cache with network_info: [{"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.745 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Releasing lock "refresh_cache-239be499-c936-4d64-a260-7b5702d8709e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.745 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.746 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.747 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.747 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.767 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.767 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.768 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.768 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.866 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.954 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:40 compute-0 nova_compute[187118]: 2025-11-24 14:32:40.955 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.019 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.163 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.164 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5632MB free_disk=73.43027114868164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.164 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.164 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.382 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance 239be499-c936-4d64-a260-7b5702d8709e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.383 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.383 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.414 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.425 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.441 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.441 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.490 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.491 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:41 compute-0 nova_compute[187118]: 2025-11-24 14:32:41.491 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:32:44 compute-0 nova_compute[187118]: 2025-11-24 14:32:44.642 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:44 compute-0 nova_compute[187118]: 2025-11-24 14:32:44.798 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.112 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9699fb56-9697-4926-9b5a-b883c523bbfb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.113 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.124 187122 DEBUG nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.187 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.187 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.195 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.195 187122 INFO nova.compute.claims [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.316 187122 DEBUG nova.compute.provider_tree [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.335 187122 DEBUG nova.scheduler.client.report [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.363 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.364 187122 DEBUG nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.403 187122 DEBUG nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.404 187122 DEBUG nova.network.neutron [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.424 187122 INFO nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.443 187122 DEBUG nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:32:45 compute-0 podman[215327]: 2025-11-24 14:32:45.461377481 +0000 UTC m=+0.069175562 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 14:32:45 compute-0 podman[215328]: 2025-11-24 14:32:45.49459199 +0000 UTC m=+0.083746987 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.546 187122 DEBUG nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.548 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.549 187122 INFO nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Creating image(s)
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.549 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.550 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.551 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.573 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.624 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.625 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.626 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.642 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.696 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.698 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.730 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.732 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.733 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.790 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.791 187122 DEBUG nova.virt.disk.api [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.792 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.850 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.851 187122 DEBUG nova.virt.disk.api [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.852 187122 DEBUG nova.objects.instance [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid 9699fb56-9697-4926-9b5a-b883c523bbfb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.865 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.866 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Ensure instance console log exists: /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.866 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.867 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:45 compute-0 nova_compute[187118]: 2025-11-24 14:32:45.867 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:46 compute-0 nova_compute[187118]: 2025-11-24 14:32:46.155 187122 DEBUG nova.policy [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:32:47 compute-0 nova_compute[187118]: 2025-11-24 14:32:47.972 187122 DEBUG nova.network.neutron [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Successfully created port: 31085152-4721-424a-87cc-2bc13b46b42a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.205 187122 DEBUG nova.network.neutron [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Successfully updated port: 31085152-4721-424a-87cc-2bc13b46b42a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.220 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-9699fb56-9697-4926-9b5a-b883c523bbfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.221 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-9699fb56-9697-4926-9b5a-b883c523bbfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.221 187122 DEBUG nova.network.neutron [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.320 187122 DEBUG nova.compute.manager [req-05173c3b-9b28-4f99-97f5-013152178a98 req-5cfa0e9f-6fe3-4837-b9c9-904c3b7219a0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received event network-changed-31085152-4721-424a-87cc-2bc13b46b42a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.321 187122 DEBUG nova.compute.manager [req-05173c3b-9b28-4f99-97f5-013152178a98 req-5cfa0e9f-6fe3-4837-b9c9-904c3b7219a0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Refreshing instance network info cache due to event network-changed-31085152-4721-424a-87cc-2bc13b46b42a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.322 187122 DEBUG oslo_concurrency.lockutils [req-05173c3b-9b28-4f99-97f5-013152178a98 req-5cfa0e9f-6fe3-4837-b9c9-904c3b7219a0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-9699fb56-9697-4926-9b5a-b883c523bbfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.374 187122 DEBUG nova.network.neutron [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.645 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:49 compute-0 nova_compute[187118]: 2025-11-24 14:32:49.800 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:50 compute-0 podman[215380]: 2025-11-24 14:32:50.476486113 +0000 UTC m=+0.081455755 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41)
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.292 187122 DEBUG nova.network.neutron [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Updating instance_info_cache with network_info: [{"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.313 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-9699fb56-9697-4926-9b5a-b883c523bbfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.313 187122 DEBUG nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Instance network_info: |[{"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.313 187122 DEBUG oslo_concurrency.lockutils [req-05173c3b-9b28-4f99-97f5-013152178a98 req-5cfa0e9f-6fe3-4837-b9c9-904c3b7219a0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-9699fb56-9697-4926-9b5a-b883c523bbfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.314 187122 DEBUG nova.network.neutron [req-05173c3b-9b28-4f99-97f5-013152178a98 req-5cfa0e9f-6fe3-4837-b9c9-904c3b7219a0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Refreshing network info cache for port 31085152-4721-424a-87cc-2bc13b46b42a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.316 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Start _get_guest_xml network_info=[{"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.320 187122 WARNING nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.325 187122 DEBUG nova.virt.libvirt.host [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.325 187122 DEBUG nova.virt.libvirt.host [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.334 187122 DEBUG nova.virt.libvirt.host [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.335 187122 DEBUG nova.virt.libvirt.host [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.335 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.336 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.337 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.337 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.338 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.338 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.339 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.339 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.340 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.340 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.341 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.341 187122 DEBUG nova.virt.hardware [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.349 187122 DEBUG nova.virt.libvirt.vif [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:32:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-251251812',display_name='tempest-TestNetworkBasicOps-server-251251812',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-251251812',id=5,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEu3eKuuw0iT6qIuFNgJKZY06F2aescoXxFdYri8Qitae4JbkW+dvlLUmw121A4GqboK6+BuUDkoe97asQM8DCCTCEiCg37W8ZCw6v6vnrukhngpOY2PekCG0UoNp7awvg==',key_name='tempest-TestNetworkBasicOps-69304412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-xsf9bmm6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:32:45Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=9699fb56-9697-4926-9b5a-b883c523bbfb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.350 187122 DEBUG nova.network.os_vif_util [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.351 187122 DEBUG nova.network.os_vif_util [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:bd:b9,bridge_name='br-int',has_traffic_filtering=True,id=31085152-4721-424a-87cc-2bc13b46b42a,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31085152-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.353 187122 DEBUG nova.objects.instance [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9699fb56-9697-4926-9b5a-b883c523bbfb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.374 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <uuid>9699fb56-9697-4926-9b5a-b883c523bbfb</uuid>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <name>instance-00000005</name>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-251251812</nova:name>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:32:51</nova:creationTime>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:32:51 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:32:51 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:32:51 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:32:51 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:32:51 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:32:51 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:32:51 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:32:51 compute-0 nova_compute[187118]:         <nova:port uuid="31085152-4721-424a-87cc-2bc13b46b42a">
Nov 24 14:32:51 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <system>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <entry name="serial">9699fb56-9697-4926-9b5a-b883c523bbfb</entry>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <entry name="uuid">9699fb56-9697-4926-9b5a-b883c523bbfb</entry>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     </system>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <os>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   </os>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <features>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   </features>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk.config"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:a6:bd:b9"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <target dev="tap31085152-47"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/console.log" append="off"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <video>
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     </video>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:32:51 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:32:51 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:32:51 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:32:51 compute-0 nova_compute[187118]: </domain>
Nov 24 14:32:51 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.374 187122 DEBUG nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Preparing to wait for external event network-vif-plugged-31085152-4721-424a-87cc-2bc13b46b42a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.374 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.374 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.375 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.375 187122 DEBUG nova.virt.libvirt.vif [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:32:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-251251812',display_name='tempest-TestNetworkBasicOps-server-251251812',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-251251812',id=5,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEu3eKuuw0iT6qIuFNgJKZY06F2aescoXxFdYri8Qitae4JbkW+dvlLUmw121A4GqboK6+BuUDkoe97asQM8DCCTCEiCg37W8ZCw6v6vnrukhngpOY2PekCG0UoNp7awvg==',key_name='tempest-TestNetworkBasicOps-69304412',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-xsf9bmm6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:32:45Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=9699fb56-9697-4926-9b5a-b883c523bbfb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.375 187122 DEBUG nova.network.os_vif_util [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.376 187122 DEBUG nova.network.os_vif_util [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a6:bd:b9,bridge_name='br-int',has_traffic_filtering=True,id=31085152-4721-424a-87cc-2bc13b46b42a,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31085152-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.376 187122 DEBUG os_vif [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:bd:b9,bridge_name='br-int',has_traffic_filtering=True,id=31085152-4721-424a-87cc-2bc13b46b42a,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31085152-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.377 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.377 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.378 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.381 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.381 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap31085152-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.382 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap31085152-47, col_values=(('external_ids', {'iface-id': '31085152-4721-424a-87cc-2bc13b46b42a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a6:bd:b9', 'vm-uuid': '9699fb56-9697-4926-9b5a-b883c523bbfb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.383 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:51 compute-0 NetworkManager[55697]: <info>  [1763994771.3847] manager: (tap31085152-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.386 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.390 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.391 187122 INFO os_vif [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a6:bd:b9,bridge_name='br-int',has_traffic_filtering=True,id=31085152-4721-424a-87cc-2bc13b46b42a,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31085152-47')
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.439 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.439 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.439 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:a6:bd:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.440 187122 INFO nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Using config drive
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.720 187122 INFO nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Creating config drive at /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk.config
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.725 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1og44wh4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.862 187122 DEBUG oslo_concurrency.processutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1og44wh4" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:32:51 compute-0 kernel: tap31085152-47: entered promiscuous mode
Nov 24 14:32:51 compute-0 NetworkManager[55697]: <info>  [1763994771.9485] manager: (tap31085152-47): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Nov 24 14:32:51 compute-0 ovn_controller[95613]: 2025-11-24T14:32:51Z|00072|binding|INFO|Claiming lport 31085152-4721-424a-87cc-2bc13b46b42a for this chassis.
Nov 24 14:32:51 compute-0 ovn_controller[95613]: 2025-11-24T14:32:51Z|00073|binding|INFO|31085152-4721-424a-87cc-2bc13b46b42a: Claiming fa:16:3e:a6:bd:b9 10.100.0.6
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.952 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:51 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:51.958 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:bd:b9 10.100.0.6'], port_security=['fa:16:3e:a6:bd:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9699fb56-9697-4926-9b5a-b883c523bbfb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ae75fb1a-b306-43f8-8244-584633a00e2d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d43fe65-8fe5-4c92-8b2d-ca15b1b3e2af, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=31085152-4721-424a-87cc-2bc13b46b42a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:32:51 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:51.960 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 31085152-4721-424a-87cc-2bc13b46b42a in datapath ec4c56dd-0181-49fa-aa54-bd6c0e4050bc bound to our chassis
Nov 24 14:32:51 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:51.963 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec4c56dd-0181-49fa-aa54-bd6c0e4050bc
Nov 24 14:32:51 compute-0 ovn_controller[95613]: 2025-11-24T14:32:51Z|00074|binding|INFO|Setting lport 31085152-4721-424a-87cc-2bc13b46b42a up in Southbound
Nov 24 14:32:51 compute-0 ovn_controller[95613]: 2025-11-24T14:32:51Z|00075|binding|INFO|Setting lport 31085152-4721-424a-87cc-2bc13b46b42a ovn-installed in OVS
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.977 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.978 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:51 compute-0 nova_compute[187118]: 2025-11-24 14:32:51.982 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:51 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:51.991 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a907ab73-f72d-4b15-9b5e-e1517c555859]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:51 compute-0 systemd-udevd[215434]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:32:52 compute-0 NetworkManager[55697]: <info>  [1763994772.0090] device (tap31085152-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:32:52 compute-0 NetworkManager[55697]: <info>  [1763994772.0103] device (tap31085152-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:32:52 compute-0 systemd-machined[153483]: New machine qemu-5-instance-00000005.
Nov 24 14:32:52 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.022 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[71ced7fb-c4ae-47ba-8acc-63b28d6ee187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.026 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[f19597fe-35b0-4094-9bd1-8718c0837786]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:52 compute-0 podman[215414]: 2025-11-24 14:32:52.051420299 +0000 UTC m=+0.112336750 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.054 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[890faf92-415b-4901-990f-445c273707ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.074 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0920ae-441d-4f3d-9e96-d7264c865a49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec4c56dd-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:18:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 299777, 'reachable_time': 21366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215454, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.092 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[2f75aafe-f96c-400e-af0c-e6c3283d5056]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec4c56dd-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 299789, 'tstamp': 299789}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215460, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec4c56dd-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 299791, 'tstamp': 299791}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215460, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.093 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec4c56dd-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.095 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.096 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.097 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec4c56dd-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.097 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.097 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec4c56dd-00, col_values=(('external_ids', {'iface-id': 'fbf12914-b379-48b4-acb2-a36d9312e540'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:32:52 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:52.097 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.326 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994772.326022, 9699fb56-9697-4926-9b5a-b883c523bbfb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.327 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] VM Started (Lifecycle Event)
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.350 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.355 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994772.3261445, 9699fb56-9697-4926-9b5a-b883c523bbfb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.356 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] VM Paused (Lifecycle Event)
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.376 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.380 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.397 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.440 187122 DEBUG nova.compute.manager [req-75aee4fd-c375-438f-aba4-678f8d1cf5d4 req-84d03f5d-e146-4c1c-bed0-d21d97997dc4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received event network-vif-plugged-31085152-4721-424a-87cc-2bc13b46b42a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.441 187122 DEBUG oslo_concurrency.lockutils [req-75aee4fd-c375-438f-aba4-678f8d1cf5d4 req-84d03f5d-e146-4c1c-bed0-d21d97997dc4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.442 187122 DEBUG oslo_concurrency.lockutils [req-75aee4fd-c375-438f-aba4-678f8d1cf5d4 req-84d03f5d-e146-4c1c-bed0-d21d97997dc4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.442 187122 DEBUG oslo_concurrency.lockutils [req-75aee4fd-c375-438f-aba4-678f8d1cf5d4 req-84d03f5d-e146-4c1c-bed0-d21d97997dc4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.442 187122 DEBUG nova.compute.manager [req-75aee4fd-c375-438f-aba4-678f8d1cf5d4 req-84d03f5d-e146-4c1c-bed0-d21d97997dc4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Processing event network-vif-plugged-31085152-4721-424a-87cc-2bc13b46b42a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.443 187122 DEBUG nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.447 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994772.446442, 9699fb56-9697-4926-9b5a-b883c523bbfb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.447 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] VM Resumed (Lifecycle Event)
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.450 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.454 187122 INFO nova.virt.libvirt.driver [-] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Instance spawned successfully.
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.454 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.470 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.474 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.491 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.491 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.492 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.493 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.493 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.494 187122 DEBUG nova.virt.libvirt.driver [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.500 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.565 187122 INFO nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Took 7.02 seconds to spawn the instance on the hypervisor.
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.566 187122 DEBUG nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.639 187122 INFO nova.compute.manager [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Took 7.47 seconds to build instance.
Nov 24 14:32:52 compute-0 nova_compute[187118]: 2025-11-24 14:32:52.654 187122 DEBUG oslo_concurrency.lockutils [None req-085fea7d-69cd-4477-9ebf-574701ab2350 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.541s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:53 compute-0 nova_compute[187118]: 2025-11-24 14:32:53.445 187122 DEBUG nova.network.neutron [req-05173c3b-9b28-4f99-97f5-013152178a98 req-5cfa0e9f-6fe3-4837-b9c9-904c3b7219a0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Updated VIF entry in instance network info cache for port 31085152-4721-424a-87cc-2bc13b46b42a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:32:53 compute-0 nova_compute[187118]: 2025-11-24 14:32:53.445 187122 DEBUG nova.network.neutron [req-05173c3b-9b28-4f99-97f5-013152178a98 req-5cfa0e9f-6fe3-4837-b9c9-904c3b7219a0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Updating instance_info_cache with network_info: [{"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:32:53 compute-0 nova_compute[187118]: 2025-11-24 14:32:53.462 187122 DEBUG oslo_concurrency.lockutils [req-05173c3b-9b28-4f99-97f5-013152178a98 req-5cfa0e9f-6fe3-4837-b9c9-904c3b7219a0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-9699fb56-9697-4926-9b5a-b883c523bbfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.506 187122 DEBUG nova.compute.manager [req-d2d6d1c5-5eac-4007-a583-ddf67583d60f req-25a21e6f-c708-403e-8654-b5f4195a842a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received event network-vif-plugged-31085152-4721-424a-87cc-2bc13b46b42a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.507 187122 DEBUG oslo_concurrency.lockutils [req-d2d6d1c5-5eac-4007-a583-ddf67583d60f req-25a21e6f-c708-403e-8654-b5f4195a842a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.507 187122 DEBUG oslo_concurrency.lockutils [req-d2d6d1c5-5eac-4007-a583-ddf67583d60f req-25a21e6f-c708-403e-8654-b5f4195a842a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.508 187122 DEBUG oslo_concurrency.lockutils [req-d2d6d1c5-5eac-4007-a583-ddf67583d60f req-25a21e6f-c708-403e-8654-b5f4195a842a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.508 187122 DEBUG nova.compute.manager [req-d2d6d1c5-5eac-4007-a583-ddf67583d60f req-25a21e6f-c708-403e-8654-b5f4195a842a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] No waiting events found dispatching network-vif-plugged-31085152-4721-424a-87cc-2bc13b46b42a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.508 187122 WARNING nova.compute.manager [req-d2d6d1c5-5eac-4007-a583-ddf67583d60f req-25a21e6f-c708-403e-8654-b5f4195a842a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received unexpected event network-vif-plugged-31085152-4721-424a-87cc-2bc13b46b42a for instance with vm_state active and task_state None.
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.803 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.834 187122 DEBUG nova.compute.manager [req-809060c1-be8a-43de-83eb-fdb184933420 req-46d94583-9560-4c00-9f6d-3909d858c79c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received event network-changed-31085152-4721-424a-87cc-2bc13b46b42a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.835 187122 DEBUG nova.compute.manager [req-809060c1-be8a-43de-83eb-fdb184933420 req-46d94583-9560-4c00-9f6d-3909d858c79c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Refreshing instance network info cache due to event network-changed-31085152-4721-424a-87cc-2bc13b46b42a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.835 187122 DEBUG oslo_concurrency.lockutils [req-809060c1-be8a-43de-83eb-fdb184933420 req-46d94583-9560-4c00-9f6d-3909d858c79c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-9699fb56-9697-4926-9b5a-b883c523bbfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.835 187122 DEBUG oslo_concurrency.lockutils [req-809060c1-be8a-43de-83eb-fdb184933420 req-46d94583-9560-4c00-9f6d-3909d858c79c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-9699fb56-9697-4926-9b5a-b883c523bbfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:32:54 compute-0 nova_compute[187118]: 2025-11-24 14:32:54.836 187122 DEBUG nova.network.neutron [req-809060c1-be8a-43de-83eb-fdb184933420 req-46d94583-9560-4c00-9f6d-3909d858c79c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Refreshing network info cache for port 31085152-4721-424a-87cc-2bc13b46b42a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:32:56 compute-0 nova_compute[187118]: 2025-11-24 14:32:56.189 187122 DEBUG nova.network.neutron [req-809060c1-be8a-43de-83eb-fdb184933420 req-46d94583-9560-4c00-9f6d-3909d858c79c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Updated VIF entry in instance network info cache for port 31085152-4721-424a-87cc-2bc13b46b42a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:32:56 compute-0 nova_compute[187118]: 2025-11-24 14:32:56.189 187122 DEBUG nova.network.neutron [req-809060c1-be8a-43de-83eb-fdb184933420 req-46d94583-9560-4c00-9f6d-3909d858c79c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Updating instance_info_cache with network_info: [{"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:32:56 compute-0 nova_compute[187118]: 2025-11-24 14:32:56.207 187122 DEBUG oslo_concurrency.lockutils [req-809060c1-be8a-43de-83eb-fdb184933420 req-46d94583-9560-4c00-9f6d-3909d858c79c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-9699fb56-9697-4926-9b5a-b883c523bbfb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:32:56 compute-0 nova_compute[187118]: 2025-11-24 14:32:56.385 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:32:56 compute-0 podman[215469]: 2025-11-24 14:32:56.484874345 +0000 UTC m=+0.089437801 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:32:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:56.659 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:32:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:56.659 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:32:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:32:56.660 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:32:59 compute-0 nova_compute[187118]: 2025-11-24 14:32:59.804 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:01 compute-0 nova_compute[187118]: 2025-11-24 14:33:01.388 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:04 compute-0 ovn_controller[95613]: 2025-11-24T14:33:04Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a6:bd:b9 10.100.0.6
Nov 24 14:33:04 compute-0 ovn_controller[95613]: 2025-11-24T14:33:04Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a6:bd:b9 10.100.0.6
Nov 24 14:33:04 compute-0 nova_compute[187118]: 2025-11-24 14:33:04.806 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:06 compute-0 nova_compute[187118]: 2025-11-24 14:33:06.391 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:08 compute-0 podman[215503]: 2025-11-24 14:33:08.449089977 +0000 UTC m=+0.053948780 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:33:09 compute-0 nova_compute[187118]: 2025-11-24 14:33:09.809 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.289 187122 INFO nova.compute.manager [None req-2417aaeb-3d60-411c-8e72-2506a8611c79 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Get console output
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.294 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.394 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:11 compute-0 podman[215527]: 2025-11-24 14:33:11.462992831 +0000 UTC m=+0.065201674 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.584 187122 DEBUG oslo_concurrency.lockutils [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9699fb56-9697-4926-9b5a-b883c523bbfb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.584 187122 DEBUG oslo_concurrency.lockutils [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.585 187122 DEBUG oslo_concurrency.lockutils [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.585 187122 DEBUG oslo_concurrency.lockutils [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.585 187122 DEBUG oslo_concurrency.lockutils [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.587 187122 INFO nova.compute.manager [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Terminating instance
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.588 187122 DEBUG nova.compute.manager [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:33:11 compute-0 kernel: tap31085152-47 (unregistering): left promiscuous mode
Nov 24 14:33:11 compute-0 NetworkManager[55697]: <info>  [1763994791.6200] device (tap31085152-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:33:11 compute-0 ovn_controller[95613]: 2025-11-24T14:33:11Z|00076|binding|INFO|Releasing lport 31085152-4721-424a-87cc-2bc13b46b42a from this chassis (sb_readonly=0)
Nov 24 14:33:11 compute-0 ovn_controller[95613]: 2025-11-24T14:33:11Z|00077|binding|INFO|Setting lport 31085152-4721-424a-87cc-2bc13b46b42a down in Southbound
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.633 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:11 compute-0 ovn_controller[95613]: 2025-11-24T14:33:11Z|00078|binding|INFO|Removing iface tap31085152-47 ovn-installed in OVS
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.636 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.651 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a6:bd:b9 10.100.0.6'], port_security=['fa:16:3e:a6:bd:b9 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9699fb56-9697-4926-9b5a-b883c523bbfb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ae75fb1a-b306-43f8-8244-584633a00e2d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d43fe65-8fe5-4c92-8b2d-ca15b1b3e2af, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=31085152-4721-424a-87cc-2bc13b46b42a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.652 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 31085152-4721-424a-87cc-2bc13b46b42a in datapath ec4c56dd-0181-49fa-aa54-bd6c0e4050bc unbound from our chassis
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.653 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ec4c56dd-0181-49fa-aa54-bd6c0e4050bc
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.663 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.677 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[005ead6d-13d8-44ed-bd71-49dbef0cc86f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:11 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Nov 24 14:33:11 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 12.680s CPU time.
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.705 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[49cd6d38-41c0-4bae-bccb-7e640e8a4122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:11 compute-0 systemd-machined[153483]: Machine qemu-5-instance-00000005 terminated.
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.709 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[c0cb6adb-2e5c-474c-9254-83623ca11fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.737 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[da474d54-1416-4d71-b0f3-f848c41d8152]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.759 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[95a3b2a1-9a4c-479b-a288-5065ae7cfb5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapec4c56dd-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:18:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 299777, 'reachable_time': 21366, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215557, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.777 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0f187d-9593-40dd-ae53-de42348cdd08]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapec4c56dd-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 299789, 'tstamp': 299789}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215558, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapec4c56dd-01'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 299791, 'tstamp': 299791}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215558, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.779 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec4c56dd-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.780 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.786 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.786 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapec4c56dd-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.787 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.787 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapec4c56dd-00, col_values=(('external_ids', {'iface-id': 'fbf12914-b379-48b4-acb2-a36d9312e540'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:11 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:11.787 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.853 187122 INFO nova.virt.libvirt.driver [-] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Instance destroyed successfully.
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.853 187122 DEBUG nova.objects.instance [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid 9699fb56-9697-4926-9b5a-b883c523bbfb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.868 187122 DEBUG nova.virt.libvirt.vif [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:32:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-251251812',display_name='tempest-TestNetworkBasicOps-server-251251812',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-251251812',id=5,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEu3eKuuw0iT6qIuFNgJKZY06F2aescoXxFdYri8Qitae4JbkW+dvlLUmw121A4GqboK6+BuUDkoe97asQM8DCCTCEiCg37W8ZCw6v6vnrukhngpOY2PekCG0UoNp7awvg==',key_name='tempest-TestNetworkBasicOps-69304412',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:32:52Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-xsf9bmm6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:32:52Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=9699fb56-9697-4926-9b5a-b883c523bbfb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.869 187122 DEBUG nova.network.os_vif_util [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "31085152-4721-424a-87cc-2bc13b46b42a", "address": "fa:16:3e:a6:bd:b9", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap31085152-47", "ovs_interfaceid": "31085152-4721-424a-87cc-2bc13b46b42a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.870 187122 DEBUG nova.network.os_vif_util [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a6:bd:b9,bridge_name='br-int',has_traffic_filtering=True,id=31085152-4721-424a-87cc-2bc13b46b42a,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31085152-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.871 187122 DEBUG os_vif [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:bd:b9,bridge_name='br-int',has_traffic_filtering=True,id=31085152-4721-424a-87cc-2bc13b46b42a,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31085152-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.875 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.875 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap31085152-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.878 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.881 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.883 187122 INFO os_vif [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a6:bd:b9,bridge_name='br-int',has_traffic_filtering=True,id=31085152-4721-424a-87cc-2bc13b46b42a,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap31085152-47')
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.884 187122 INFO nova.virt.libvirt.driver [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Deleting instance files /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb_del
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.885 187122 INFO nova.virt.libvirt.driver [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Deletion of /var/lib/nova/instances/9699fb56-9697-4926-9b5a-b883c523bbfb_del complete
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.933 187122 INFO nova.compute.manager [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.934 187122 DEBUG oslo.service.loopingcall [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.934 187122 DEBUG nova.compute.manager [-] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:33:11 compute-0 nova_compute[187118]: 2025-11-24 14:33:11.934 187122 DEBUG nova.network.neutron [-] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.477 187122 DEBUG nova.compute.manager [req-f89986c4-e5cc-4006-a416-8e4ca93a194f req-b3cb4720-9225-4629-9f1e-c864e0832193 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received event network-vif-unplugged-31085152-4721-424a-87cc-2bc13b46b42a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.478 187122 DEBUG oslo_concurrency.lockutils [req-f89986c4-e5cc-4006-a416-8e4ca93a194f req-b3cb4720-9225-4629-9f1e-c864e0832193 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.478 187122 DEBUG oslo_concurrency.lockutils [req-f89986c4-e5cc-4006-a416-8e4ca93a194f req-b3cb4720-9225-4629-9f1e-c864e0832193 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.478 187122 DEBUG oslo_concurrency.lockutils [req-f89986c4-e5cc-4006-a416-8e4ca93a194f req-b3cb4720-9225-4629-9f1e-c864e0832193 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.479 187122 DEBUG nova.compute.manager [req-f89986c4-e5cc-4006-a416-8e4ca93a194f req-b3cb4720-9225-4629-9f1e-c864e0832193 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] No waiting events found dispatching network-vif-unplugged-31085152-4721-424a-87cc-2bc13b46b42a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.479 187122 DEBUG nova.compute.manager [req-f89986c4-e5cc-4006-a416-8e4ca93a194f req-b3cb4720-9225-4629-9f1e-c864e0832193 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received event network-vif-unplugged-31085152-4721-424a-87cc-2bc13b46b42a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.877 187122 DEBUG nova.network.neutron [-] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.897 187122 INFO nova.compute.manager [-] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Took 0.96 seconds to deallocate network for instance.
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.940 187122 DEBUG oslo_concurrency.lockutils [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.940 187122 DEBUG oslo_concurrency.lockutils [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:12 compute-0 nova_compute[187118]: 2025-11-24 14:33:12.942 187122 DEBUG nova.compute.manager [req-95c9ed82-db92-46d8-9a45-302c0965c4c1 req-a63f8b93-a2b4-4fe4-855b-71da493465f4 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received event network-vif-deleted-31085152-4721-424a-87cc-2bc13b46b42a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:13 compute-0 nova_compute[187118]: 2025-11-24 14:33:13.005 187122 DEBUG nova.compute.provider_tree [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:33:13 compute-0 nova_compute[187118]: 2025-11-24 14:33:13.017 187122 DEBUG nova.scheduler.client.report [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:33:13 compute-0 nova_compute[187118]: 2025-11-24 14:33:13.033 187122 DEBUG oslo_concurrency.lockutils [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:13 compute-0 nova_compute[187118]: 2025-11-24 14:33:13.055 187122 INFO nova.scheduler.client.report [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance 9699fb56-9697-4926-9b5a-b883c523bbfb
Nov 24 14:33:13 compute-0 nova_compute[187118]: 2025-11-24 14:33:13.106 187122 DEBUG oslo_concurrency.lockutils [None req-da0c4103-b3a0-490f-9cc2-2f2aa02f7d28 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.522s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.557 187122 DEBUG nova.compute.manager [req-5f0b3793-1e2b-4c2e-9730-e365cd608fd6 req-12eaad39-8520-4df0-937e-c483de4c3a9a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received event network-vif-plugged-31085152-4721-424a-87cc-2bc13b46b42a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.558 187122 DEBUG oslo_concurrency.lockutils [req-5f0b3793-1e2b-4c2e-9730-e365cd608fd6 req-12eaad39-8520-4df0-937e-c483de4c3a9a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.558 187122 DEBUG oslo_concurrency.lockutils [req-5f0b3793-1e2b-4c2e-9730-e365cd608fd6 req-12eaad39-8520-4df0-937e-c483de4c3a9a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.559 187122 DEBUG oslo_concurrency.lockutils [req-5f0b3793-1e2b-4c2e-9730-e365cd608fd6 req-12eaad39-8520-4df0-937e-c483de4c3a9a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9699fb56-9697-4926-9b5a-b883c523bbfb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.559 187122 DEBUG nova.compute.manager [req-5f0b3793-1e2b-4c2e-9730-e365cd608fd6 req-12eaad39-8520-4df0-937e-c483de4c3a9a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] No waiting events found dispatching network-vif-plugged-31085152-4721-424a-87cc-2bc13b46b42a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.559 187122 WARNING nova.compute.manager [req-5f0b3793-1e2b-4c2e-9730-e365cd608fd6 req-12eaad39-8520-4df0-937e-c483de4c3a9a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Received unexpected event network-vif-plugged-31085152-4721-424a-87cc-2bc13b46b42a for instance with vm_state deleted and task_state None.
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.811 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.842 187122 DEBUG oslo_concurrency.lockutils [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "239be499-c936-4d64-a260-7b5702d8709e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.843 187122 DEBUG oslo_concurrency.lockutils [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.844 187122 DEBUG oslo_concurrency.lockutils [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "239be499-c936-4d64-a260-7b5702d8709e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.845 187122 DEBUG oslo_concurrency.lockutils [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.845 187122 DEBUG oslo_concurrency.lockutils [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.847 187122 INFO nova.compute.manager [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Terminating instance
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.849 187122 DEBUG nova.compute.manager [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:33:14 compute-0 kernel: tapd3287295-b9 (unregistering): left promiscuous mode
Nov 24 14:33:14 compute-0 NetworkManager[55697]: <info>  [1763994794.8740] device (tapd3287295-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:33:14 compute-0 ovn_controller[95613]: 2025-11-24T14:33:14Z|00079|binding|INFO|Releasing lport d3287295-b9fe-4bf9-bf10-567417593602 from this chassis (sb_readonly=0)
Nov 24 14:33:14 compute-0 ovn_controller[95613]: 2025-11-24T14:33:14Z|00080|binding|INFO|Setting lport d3287295-b9fe-4bf9-bf10-567417593602 down in Southbound
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.879 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:14 compute-0 ovn_controller[95613]: 2025-11-24T14:33:14Z|00081|binding|INFO|Removing iface tapd3287295-b9 ovn-installed in OVS
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.884 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:14.888 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:c1:9a 10.100.0.14'], port_security=['fa:16:3e:41:c1:9a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '239be499-c936-4d64-a260-7b5702d8709e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a93cb7d-4e07-42bc-bcf1-b5647ae1be26', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8d43fe65-8fe5-4c92-8b2d-ca15b1b3e2af, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=d3287295-b9fe-4bf9-bf10-567417593602) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:33:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:14.889 104469 INFO neutron.agent.ovn.metadata.agent [-] Port d3287295-b9fe-4bf9-bf10-567417593602 in datapath ec4c56dd-0181-49fa-aa54-bd6c0e4050bc unbound from our chassis
Nov 24 14:33:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:14.889 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec4c56dd-0181-49fa-aa54-bd6c0e4050bc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:33:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:14.890 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3f5919e5-092d-4d7e-9a3e-b94fdd69caac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:14.891 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc namespace which is not needed anymore
Nov 24 14:33:14 compute-0 nova_compute[187118]: 2025-11-24 14:33:14.896 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:14 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Nov 24 14:33:14 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 15.380s CPU time.
Nov 24 14:33:14 compute-0 systemd-machined[153483]: Machine qemu-4-instance-00000004 terminated.
Nov 24 14:33:15 compute-0 neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc[215130]: [NOTICE]   (215134) : haproxy version is 2.8.14-c23fe91
Nov 24 14:33:15 compute-0 neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc[215130]: [NOTICE]   (215134) : path to executable is /usr/sbin/haproxy
Nov 24 14:33:15 compute-0 neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc[215130]: [WARNING]  (215134) : Exiting Master process...
Nov 24 14:33:15 compute-0 neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc[215130]: [WARNING]  (215134) : Exiting Master process...
Nov 24 14:33:15 compute-0 neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc[215130]: [ALERT]    (215134) : Current worker (215136) exited with code 143 (Terminated)
Nov 24 14:33:15 compute-0 neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc[215130]: [WARNING]  (215134) : All workers exited. Exiting... (0)
Nov 24 14:33:15 compute-0 systemd[1]: libpod-6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb.scope: Deactivated successfully.
Nov 24 14:33:15 compute-0 podman[215601]: 2025-11-24 14:33:15.044038406 +0000 UTC m=+0.048813461 container died 6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 24 14:33:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-fe8b8d5daccecdfdf9ed5329357073d4794e7359299b710c0fce88333f3bf20a-merged.mount: Deactivated successfully.
Nov 24 14:33:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb-userdata-shm.mount: Deactivated successfully.
Nov 24 14:33:15 compute-0 podman[215601]: 2025-11-24 14:33:15.0870535 +0000 UTC m=+0.091828555 container cleanup 6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:33:15 compute-0 systemd[1]: libpod-conmon-6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb.scope: Deactivated successfully.
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.118 187122 INFO nova.virt.libvirt.driver [-] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Instance destroyed successfully.
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.119 187122 DEBUG nova.objects.instance [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid 239be499-c936-4d64-a260-7b5702d8709e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.136 187122 DEBUG nova.virt.libvirt.vif [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:32:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-469240546',display_name='tempest-TestNetworkBasicOps-server-469240546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-469240546',id=4,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKxe74ZcgTYujYW20bBm/F2VsORUD4tAq4N+l8Q3k4S31rtcRGawvuSKeYcLd3qb0oCLPQVxECH8WAslJ4/Gv/sMGAO54E5uUvsc98LyelGw3wULG0uLrRcuBw3seWweYQ==',key_name='tempest-TestNetworkBasicOps-1370680654',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:32:10Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-2hn5qdal',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:32:10Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=239be499-c936-4d64-a260-7b5702d8709e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.137 187122 DEBUG nova.network.os_vif_util [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "d3287295-b9fe-4bf9-bf10-567417593602", "address": "fa:16:3e:41:c1:9a", "network": {"id": "ec4c56dd-0181-49fa-aa54-bd6c0e4050bc", "bridge": "br-int", "label": "tempest-network-smoke--1903470332", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd3287295-b9", "ovs_interfaceid": "d3287295-b9fe-4bf9-bf10-567417593602", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.138 187122 DEBUG nova.network.os_vif_util [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:41:c1:9a,bridge_name='br-int',has_traffic_filtering=True,id=d3287295-b9fe-4bf9-bf10-567417593602,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3287295-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.138 187122 DEBUG os_vif [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:c1:9a,bridge_name='br-int',has_traffic_filtering=True,id=d3287295-b9fe-4bf9-bf10-567417593602,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3287295-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.140 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.141 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd3287295-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.142 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.144 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.147 187122 INFO os_vif [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:41:c1:9a,bridge_name='br-int',has_traffic_filtering=True,id=d3287295-b9fe-4bf9-bf10-567417593602,network=Network(ec4c56dd-0181-49fa-aa54-bd6c0e4050bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd3287295-b9')
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.147 187122 INFO nova.virt.libvirt.driver [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Deleting instance files /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e_del
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.148 187122 INFO nova.virt.libvirt.driver [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Deletion of /var/lib/nova/instances/239be499-c936-4d64-a260-7b5702d8709e_del complete
Nov 24 14:33:15 compute-0 podman[215645]: 2025-11-24 14:33:15.160427109 +0000 UTC m=+0.048848013 container remove 6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:33:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:15.166 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a7dc35-40cf-4f88-a73c-d443410b3a6b]: (4, ('Mon Nov 24 02:33:14 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc (6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb)\n6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb\nMon Nov 24 02:33:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc (6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb)\n6ae5030458032801c8637cb3af001c7244937c61634a1b5e78284fbd74290feb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:15.167 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[c290642a-7859-48ce-a9bd-1352d9f460f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:15.168 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapec4c56dd-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.170 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:15 compute-0 kernel: tapec4c56dd-00: left promiscuous mode
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.188 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:15.190 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc4b9a7-267c-4b7b-85b6-bbdd7d940bfe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.196 187122 INFO nova.compute.manager [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.197 187122 DEBUG oslo.service.loopingcall [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.197 187122 DEBUG nova.compute.manager [-] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.198 187122 DEBUG nova.network.neutron [-] [instance: 239be499-c936-4d64-a260-7b5702d8709e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:33:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:15.213 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[2c83124f-7a9c-4a38-b5ca-33dbceed5f5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:15.214 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ca2e2eaa-9f98-493c-bedc-5039e971c677]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:15.238 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e339c5d0-fa54-4609-942a-d0633a468423]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 299770, 'reachable_time': 17361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215662, 'error': None, 'target': 'ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:15.241 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ec4c56dd-0181-49fa-aa54-bd6c0e4050bc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:33:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:15.241 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[f05be9c5-50d1-4a66-82a3-62fd9567c60e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:15 compute-0 systemd[1]: run-netns-ovnmeta\x2dec4c56dd\x2d0181\x2d49fa\x2daa54\x2dbd6c0e4050bc.mount: Deactivated successfully.
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.438 187122 DEBUG nova.compute.manager [req-57e71eaa-6759-44f1-bd1b-5a71fd2322b0 req-a6f63a86-1e6d-4ac2-9441-d6af06ab905e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received event network-vif-unplugged-d3287295-b9fe-4bf9-bf10-567417593602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.439 187122 DEBUG oslo_concurrency.lockutils [req-57e71eaa-6759-44f1-bd1b-5a71fd2322b0 req-a6f63a86-1e6d-4ac2-9441-d6af06ab905e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "239be499-c936-4d64-a260-7b5702d8709e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.440 187122 DEBUG oslo_concurrency.lockutils [req-57e71eaa-6759-44f1-bd1b-5a71fd2322b0 req-a6f63a86-1e6d-4ac2-9441-d6af06ab905e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.440 187122 DEBUG oslo_concurrency.lockutils [req-57e71eaa-6759-44f1-bd1b-5a71fd2322b0 req-a6f63a86-1e6d-4ac2-9441-d6af06ab905e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.441 187122 DEBUG nova.compute.manager [req-57e71eaa-6759-44f1-bd1b-5a71fd2322b0 req-a6f63a86-1e6d-4ac2-9441-d6af06ab905e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] No waiting events found dispatching network-vif-unplugged-d3287295-b9fe-4bf9-bf10-567417593602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.441 187122 DEBUG nova.compute.manager [req-57e71eaa-6759-44f1-bd1b-5a71fd2322b0 req-a6f63a86-1e6d-4ac2-9441-d6af06ab905e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received event network-vif-unplugged-d3287295-b9fe-4bf9-bf10-567417593602 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.741 187122 DEBUG nova.network.neutron [-] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.763 187122 INFO nova.compute.manager [-] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Took 0.57 seconds to deallocate network for instance.
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.830 187122 DEBUG oslo_concurrency.lockutils [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.831 187122 DEBUG oslo_concurrency.lockutils [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.902 187122 DEBUG nova.compute.provider_tree [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.926 187122 DEBUG nova.scheduler.client.report [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.953 187122 DEBUG oslo_concurrency.lockutils [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:15 compute-0 nova_compute[187118]: 2025-11-24 14:33:15.974 187122 INFO nova.scheduler.client.report [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance 239be499-c936-4d64-a260-7b5702d8709e
Nov 24 14:33:16 compute-0 nova_compute[187118]: 2025-11-24 14:33:16.047 187122 DEBUG oslo_concurrency.lockutils [None req-89f45579-87d7-463f-a62d-566f2f305cb0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:16 compute-0 podman[215663]: 2025-11-24 14:33:16.492672753 +0000 UTC m=+0.099970576 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:33:16 compute-0 podman[215664]: 2025-11-24 14:33:16.507141988 +0000 UTC m=+0.112584771 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 14:33:17 compute-0 nova_compute[187118]: 2025-11-24 14:33:17.518 187122 DEBUG nova.compute.manager [req-9dffabde-7f7c-4a41-b90b-22f3869cd449 req-f913f885-0835-477d-b86f-6d93f31903b0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received event network-vif-plugged-d3287295-b9fe-4bf9-bf10-567417593602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:17 compute-0 nova_compute[187118]: 2025-11-24 14:33:17.518 187122 DEBUG oslo_concurrency.lockutils [req-9dffabde-7f7c-4a41-b90b-22f3869cd449 req-f913f885-0835-477d-b86f-6d93f31903b0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "239be499-c936-4d64-a260-7b5702d8709e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:17 compute-0 nova_compute[187118]: 2025-11-24 14:33:17.519 187122 DEBUG oslo_concurrency.lockutils [req-9dffabde-7f7c-4a41-b90b-22f3869cd449 req-f913f885-0835-477d-b86f-6d93f31903b0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:17 compute-0 nova_compute[187118]: 2025-11-24 14:33:17.519 187122 DEBUG oslo_concurrency.lockutils [req-9dffabde-7f7c-4a41-b90b-22f3869cd449 req-f913f885-0835-477d-b86f-6d93f31903b0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "239be499-c936-4d64-a260-7b5702d8709e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:17 compute-0 nova_compute[187118]: 2025-11-24 14:33:17.519 187122 DEBUG nova.compute.manager [req-9dffabde-7f7c-4a41-b90b-22f3869cd449 req-f913f885-0835-477d-b86f-6d93f31903b0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] No waiting events found dispatching network-vif-plugged-d3287295-b9fe-4bf9-bf10-567417593602 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:33:17 compute-0 nova_compute[187118]: 2025-11-24 14:33:17.520 187122 WARNING nova.compute.manager [req-9dffabde-7f7c-4a41-b90b-22f3869cd449 req-f913f885-0835-477d-b86f-6d93f31903b0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received unexpected event network-vif-plugged-d3287295-b9fe-4bf9-bf10-567417593602 for instance with vm_state deleted and task_state None.
Nov 24 14:33:17 compute-0 nova_compute[187118]: 2025-11-24 14:33:17.520 187122 DEBUG nova.compute.manager [req-9dffabde-7f7c-4a41-b90b-22f3869cd449 req-f913f885-0835-477d-b86f-6d93f31903b0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Received event network-vif-deleted-d3287295-b9fe-4bf9-bf10-567417593602 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:19 compute-0 nova_compute[187118]: 2025-11-24 14:33:19.606 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:19 compute-0 nova_compute[187118]: 2025-11-24 14:33:19.674 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:19 compute-0 nova_compute[187118]: 2025-11-24 14:33:19.813 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:20 compute-0 nova_compute[187118]: 2025-11-24 14:33:20.143 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:21 compute-0 podman[215702]: 2025-11-24 14:33:21.475398041 +0000 UTC m=+0.074059960 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 14:33:22 compute-0 podman[215725]: 2025-11-24 14:33:22.533815321 +0000 UTC m=+0.136663037 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 14:33:24 compute-0 nova_compute[187118]: 2025-11-24 14:33:24.815 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:25 compute-0 nova_compute[187118]: 2025-11-24 14:33:25.145 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:26 compute-0 nova_compute[187118]: 2025-11-24 14:33:26.850 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763994791.8485203, 9699fb56-9697-4926-9b5a-b883c523bbfb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:33:26 compute-0 nova_compute[187118]: 2025-11-24 14:33:26.850 187122 INFO nova.compute.manager [-] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] VM Stopped (Lifecycle Event)
Nov 24 14:33:26 compute-0 nova_compute[187118]: 2025-11-24 14:33:26.874 187122 DEBUG nova.compute.manager [None req-02fd35f3-8707-4bfa-9eb6-a298ba8783e1 - - - - - -] [instance: 9699fb56-9697-4926-9b5a-b883c523bbfb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:33:27 compute-0 podman[215751]: 2025-11-24 14:33:27.436443465 +0000 UTC m=+0.051283239 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:33:29 compute-0 nova_compute[187118]: 2025-11-24 14:33:29.816 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:30 compute-0 nova_compute[187118]: 2025-11-24 14:33:30.114 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763994795.1135826, 239be499-c936-4d64-a260-7b5702d8709e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:33:30 compute-0 nova_compute[187118]: 2025-11-24 14:33:30.114 187122 INFO nova.compute.manager [-] [instance: 239be499-c936-4d64-a260-7b5702d8709e] VM Stopped (Lifecycle Event)
Nov 24 14:33:30 compute-0 nova_compute[187118]: 2025-11-24 14:33:30.140 187122 DEBUG nova.compute.manager [None req-73c4af8d-6d2d-4908-8239-b2ba8e533fcb - - - - - -] [instance: 239be499-c936-4d64-a260-7b5702d8709e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:33:30 compute-0 nova_compute[187118]: 2025-11-24 14:33:30.147 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:33 compute-0 nova_compute[187118]: 2025-11-24 14:33:33.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.519 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.520 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.537 187122 DEBUG nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.616 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.617 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.627 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.627 187122 INFO nova.compute.claims [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.742 187122 DEBUG nova.compute.provider_tree [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.753 187122 DEBUG nova.scheduler.client.report [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.769 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.770 187122 DEBUG nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.811 187122 DEBUG nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.812 187122 DEBUG nova.network.neutron [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.818 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.825 187122 INFO nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.837 187122 DEBUG nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.913 187122 DEBUG nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.914 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.915 187122 INFO nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Creating image(s)
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.915 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.916 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.916 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:34 compute-0 nova_compute[187118]: 2025-11-24 14:33:34.933 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.006 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.008 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.009 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.028 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.095 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.096 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.128 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:33:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.133 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.135 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.136 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.157 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.201 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.202 187122 DEBUG nova.virt.disk.api [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.202 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.267 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.268 187122 DEBUG nova.virt.disk.api [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.269 187122 DEBUG nova.objects.instance [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid 70f125d3-772c-4512-89cd-87864bebf8cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.286 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.286 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Ensure instance console log exists: /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.287 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.287 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.287 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:35 compute-0 nova_compute[187118]: 2025-11-24 14:33:35.299 187122 DEBUG nova.policy [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.030 187122 DEBUG nova.network.neutron [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Successfully created port: 80657a89-07d8-4355-a80e-f13874579df8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.600 187122 DEBUG nova.network.neutron [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Successfully updated port: 80657a89-07d8-4355-a80e-f13874579df8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.618 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.618 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.619 187122 DEBUG nova.network.neutron [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.716 187122 DEBUG nova.compute.manager [req-795a3e26-7fb6-42ef-91c4-45a3284f6e6b req-a9b6e1fb-f561-412f-8a37-ad2eca6a3604 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-changed-80657a89-07d8-4355-a80e-f13874579df8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.717 187122 DEBUG nova.compute.manager [req-795a3e26-7fb6-42ef-91c4-45a3284f6e6b req-a9b6e1fb-f561-412f-8a37-ad2eca6a3604 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing instance network info cache due to event network-changed-80657a89-07d8-4355-a80e-f13874579df8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.718 187122 DEBUG oslo_concurrency.lockutils [req-795a3e26-7fb6-42ef-91c4-45a3284f6e6b req-a9b6e1fb-f561-412f-8a37-ad2eca6a3604 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:33:36 compute-0 nova_compute[187118]: 2025-11-24 14:33:36.816 187122 DEBUG nova.network.neutron [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.656 187122 DEBUG nova.network.neutron [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.677 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.677 187122 DEBUG nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Instance network_info: |[{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.678 187122 DEBUG oslo_concurrency.lockutils [req-795a3e26-7fb6-42ef-91c4-45a3284f6e6b req-a9b6e1fb-f561-412f-8a37-ad2eca6a3604 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.679 187122 DEBUG nova.network.neutron [req-795a3e26-7fb6-42ef-91c4-45a3284f6e6b req-a9b6e1fb-f561-412f-8a37-ad2eca6a3604 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing network info cache for port 80657a89-07d8-4355-a80e-f13874579df8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.684 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Start _get_guest_xml network_info=[{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.690 187122 WARNING nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.698 187122 DEBUG nova.virt.libvirt.host [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.699 187122 DEBUG nova.virt.libvirt.host [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.705 187122 DEBUG nova.virt.libvirt.host [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.705 187122 DEBUG nova.virt.libvirt.host [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.706 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.707 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.708 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.708 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.708 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.709 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.709 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.710 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.710 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.711 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.711 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.712 187122 DEBUG nova.virt.hardware [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.718 187122 DEBUG nova.virt.libvirt.vif [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:33:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-814470289',display_name='tempest-TestNetworkBasicOps-server-814470289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-814470289',id=6,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNxVUOsd/7KtYLgQBtlHKzBeWF9UhFxiZgEb7YLnyBIIN1OVKJ0gJRpD8NWMGNkw7u8jH0JIXAWvXNBkBhNhVRmW7IlL2b/guGzfz0SVJ7p7J0ywko8iMgOfh8p0fPQCuw==',key_name='tempest-TestNetworkBasicOps-1052892258',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-00ajpf73',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:33:34Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=70f125d3-772c-4512-89cd-87864bebf8cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.719 187122 DEBUG nova.network.os_vif_util [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.720 187122 DEBUG nova.network.os_vif_util [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:5b:4f,bridge_name='br-int',has_traffic_filtering=True,id=80657a89-07d8-4355-a80e-f13874579df8,network=Network(88c27d4f-052b-4040-8dc7-91a7fc24ef8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80657a89-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.721 187122 DEBUG nova.objects.instance [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid 70f125d3-772c-4512-89cd-87864bebf8cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.732 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <uuid>70f125d3-772c-4512-89cd-87864bebf8cc</uuid>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <name>instance-00000006</name>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-814470289</nova:name>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:33:37</nova:creationTime>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:33:37 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:33:37 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:33:37 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:33:37 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:33:37 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:33:37 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:33:37 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:33:37 compute-0 nova_compute[187118]:         <nova:port uuid="80657a89-07d8-4355-a80e-f13874579df8">
Nov 24 14:33:37 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <system>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <entry name="serial">70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <entry name="uuid">70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     </system>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <os>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   </os>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <features>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   </features>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.config"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:74:5b:4f"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <target dev="tap80657a89-07"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log" append="off"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <video>
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     </video>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:33:37 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:33:37 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:33:37 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:33:37 compute-0 nova_compute[187118]: </domain>
Nov 24 14:33:37 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.733 187122 DEBUG nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Preparing to wait for external event network-vif-plugged-80657a89-07d8-4355-a80e-f13874579df8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.734 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.734 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.735 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.736 187122 DEBUG nova.virt.libvirt.vif [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:33:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-814470289',display_name='tempest-TestNetworkBasicOps-server-814470289',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-814470289',id=6,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNxVUOsd/7KtYLgQBtlHKzBeWF9UhFxiZgEb7YLnyBIIN1OVKJ0gJRpD8NWMGNkw7u8jH0JIXAWvXNBkBhNhVRmW7IlL2b/guGzfz0SVJ7p7J0ywko8iMgOfh8p0fPQCuw==',key_name='tempest-TestNetworkBasicOps-1052892258',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-00ajpf73',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:33:34Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=70f125d3-772c-4512-89cd-87864bebf8cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.736 187122 DEBUG nova.network.os_vif_util [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.737 187122 DEBUG nova.network.os_vif_util [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:5b:4f,bridge_name='br-int',has_traffic_filtering=True,id=80657a89-07d8-4355-a80e-f13874579df8,network=Network(88c27d4f-052b-4040-8dc7-91a7fc24ef8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80657a89-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.737 187122 DEBUG os_vif [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:5b:4f,bridge_name='br-int',has_traffic_filtering=True,id=80657a89-07d8-4355-a80e-f13874579df8,network=Network(88c27d4f-052b-4040-8dc7-91a7fc24ef8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80657a89-07') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.738 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.739 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.739 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.742 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.742 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap80657a89-07, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.743 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap80657a89-07, col_values=(('external_ids', {'iface-id': '80657a89-07d8-4355-a80e-f13874579df8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:5b:4f', 'vm-uuid': '70f125d3-772c-4512-89cd-87864bebf8cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.744 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:37 compute-0 NetworkManager[55697]: <info>  [1763994817.7464] manager: (tap80657a89-07): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.747 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.751 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.752 187122 INFO os_vif [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:5b:4f,bridge_name='br-int',has_traffic_filtering=True,id=80657a89-07d8-4355-a80e-f13874579df8,network=Network(88c27d4f-052b-4040-8dc7-91a7fc24ef8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80657a89-07')
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.803 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.804 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.804 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:74:5b:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:33:37 compute-0 nova_compute[187118]: 2025-11-24 14:33:37.805 187122 INFO nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Using config drive
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.272 187122 INFO nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Creating config drive at /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.config
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.278 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1g1lelgk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.405 187122 DEBUG oslo_concurrency.processutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1g1lelgk" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:33:38 compute-0 kernel: tap80657a89-07: entered promiscuous mode
Nov 24 14:33:38 compute-0 NetworkManager[55697]: <info>  [1763994818.4648] manager: (tap80657a89-07): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Nov 24 14:33:38 compute-0 ovn_controller[95613]: 2025-11-24T14:33:38Z|00082|binding|INFO|Claiming lport 80657a89-07d8-4355-a80e-f13874579df8 for this chassis.
Nov 24 14:33:38 compute-0 ovn_controller[95613]: 2025-11-24T14:33:38Z|00083|binding|INFO|80657a89-07d8-4355-a80e-f13874579df8: Claiming fa:16:3e:74:5b:4f 10.100.0.5
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.466 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.471 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.479 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:5b:4f 10.100.0.5'], port_security=['fa:16:3e:74:5b:4f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '70f125d3-772c-4512-89cd-87864bebf8cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88c27d4f-052b-4040-8dc7-91a7fc24ef8c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '006ec4d8-4baf-4197-8d42-e48ef06fa486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=404425a2-d90a-4f58-8342-049369e4c90c, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=80657a89-07d8-4355-a80e-f13874579df8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.481 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 80657a89-07d8-4355-a80e-f13874579df8 in datapath 88c27d4f-052b-4040-8dc7-91a7fc24ef8c bound to our chassis
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.481 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88c27d4f-052b-4040-8dc7-91a7fc24ef8c
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.494 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[6293b306-6a46-49b2-9c31-e79d9e5a95d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.495 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88c27d4f-01 in ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.497 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88c27d4f-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.497 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3d807b69-e414-4687-beb6-cb60375d5403]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.498 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d85d8a-5028-4d66-8963-53d907ee6457]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.509 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[71db67a8-b221-4b12-b02a-c23255128873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 systemd-machined[153483]: New machine qemu-6-instance-00000006.
Nov 24 14:33:38 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.546 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[09063f16-f059-4fe6-ad2a-7fab99f5849b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.549 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:38 compute-0 ovn_controller[95613]: 2025-11-24T14:33:38Z|00084|binding|INFO|Setting lport 80657a89-07d8-4355-a80e-f13874579df8 ovn-installed in OVS
Nov 24 14:33:38 compute-0 ovn_controller[95613]: 2025-11-24T14:33:38Z|00085|binding|INFO|Setting lport 80657a89-07d8-4355-a80e-f13874579df8 up in Southbound
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.553 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:38 compute-0 systemd-udevd[215828]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:33:38 compute-0 NetworkManager[55697]: <info>  [1763994818.5694] device (tap80657a89-07): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:33:38 compute-0 NetworkManager[55697]: <info>  [1763994818.5703] device (tap80657a89-07): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.575 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[de9aae78-6f7e-46cb-8861-846177d90a81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.579 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7d18eb95-3fca-4086-8645-3f296e73991b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 systemd-udevd[215844]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:33:38 compute-0 NetworkManager[55697]: <info>  [1763994818.5804] manager: (tap88c27d4f-00): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Nov 24 14:33:38 compute-0 podman[215807]: 2025-11-24 14:33:38.607497511 +0000 UTC m=+0.102444653 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.607 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[d5a919f3-bccf-4ea1-b470-3be3ee60f180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.610 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd61e86-e49d-4c24-931c-6dc0cabf7772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 NetworkManager[55697]: <info>  [1763994818.6347] device (tap88c27d4f-00): carrier: link connected
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.639 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[e359e32d-ca51-467c-a7e4-cfdfce0091c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.656 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[b3cdc28e-b70d-4ec7-b4fd-2fcfb33506ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88c27d4f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:da:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 308606, 'reachable_time': 39463, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215865, 'error': None, 'target': 'ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.671 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[6a1e6c90-1af2-421b-98d8-720d1cba5964]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:da69'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 308606, 'tstamp': 308606}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215866, 'error': None, 'target': 'ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.689 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a9330543-4b78-4138-b1c1-b8f6229d2407]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88c27d4f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:da:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 308606, 'reachable_time': 39463, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215867, 'error': None, 'target': 'ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.721 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8dffca11-47b0-4470-a3b8-fcf8d19af6c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.767 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[b395b565-2564-4b80-b8ea-623466261e42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.769 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88c27d4f-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.769 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.769 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88c27d4f-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:38 compute-0 kernel: tap88c27d4f-00: entered promiscuous mode
Nov 24 14:33:38 compute-0 NetworkManager[55697]: <info>  [1763994818.7715] manager: (tap88c27d4f-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.771 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.773 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88c27d4f-00, col_values=(('external_ids', {'iface-id': 'f0d428f1-79c6-415d-9945-8d2b6b384323'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.774 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:38 compute-0 ovn_controller[95613]: 2025-11-24T14:33:38Z|00086|binding|INFO|Releasing lport f0d428f1-79c6-415d-9945-8d2b6b384323 from this chassis (sb_readonly=0)
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.785 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.786 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.786 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88c27d4f-052b-4040-8dc7-91a7fc24ef8c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88c27d4f-052b-4040-8dc7-91a7fc24ef8c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.787 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[aa53e68a-28ca-4863-92a8-f4cbcd3e928c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.788 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-88c27d4f-052b-4040-8dc7-91a7fc24ef8c
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/88c27d4f-052b-4040-8dc7-91a7fc24ef8c.pid.haproxy
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID 88c27d4f-052b-4040-8dc7-91a7fc24ef8c
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:33:38 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:38.788 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c', 'env', 'PROCESS_TAG=haproxy-88c27d4f-052b-4040-8dc7-91a7fc24ef8c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88c27d4f-052b-4040-8dc7-91a7fc24ef8c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.809 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.810 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.810 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.815 187122 DEBUG nova.compute.manager [req-582a48fe-be09-4557-9ede-a049d8d2fce9 req-38058e49-07f6-4c95-a230-f2f96f99ff3c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-plugged-80657a89-07d8-4355-a80e-f13874579df8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.816 187122 DEBUG oslo_concurrency.lockutils [req-582a48fe-be09-4557-9ede-a049d8d2fce9 req-38058e49-07f6-4c95-a230-f2f96f99ff3c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.816 187122 DEBUG oslo_concurrency.lockutils [req-582a48fe-be09-4557-9ede-a049d8d2fce9 req-38058e49-07f6-4c95-a230-f2f96f99ff3c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.816 187122 DEBUG oslo_concurrency.lockutils [req-582a48fe-be09-4557-9ede-a049d8d2fce9 req-38058e49-07f6-4c95-a230-f2f96f99ff3c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.816 187122 DEBUG nova.compute.manager [req-582a48fe-be09-4557-9ede-a049d8d2fce9 req-38058e49-07f6-4c95-a230-f2f96f99ff3c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Processing event network-vif-plugged-80657a89-07d8-4355-a80e-f13874579df8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.863 187122 DEBUG nova.network.neutron [req-795a3e26-7fb6-42ef-91c4-45a3284f6e6b req-a9b6e1fb-f561-412f-8a37-ad2eca6a3604 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updated VIF entry in instance network info cache for port 80657a89-07d8-4355-a80e-f13874579df8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.864 187122 DEBUG nova.network.neutron [req-795a3e26-7fb6-42ef-91c4-45a3284f6e6b req-a9b6e1fb-f561-412f-8a37-ad2eca6a3604 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:33:38 compute-0 nova_compute[187118]: 2025-11-24 14:33:38.880 187122 DEBUG oslo_concurrency.lockutils [req-795a3e26-7fb6-42ef-91c4-45a3284f6e6b req-a9b6e1fb-f561-412f-8a37-ad2eca6a3604 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.073 187122 DEBUG nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.074 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994819.0739238, 70f125d3-772c-4512-89cd-87864bebf8cc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.075 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] VM Started (Lifecycle Event)
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.081 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.086 187122 INFO nova.virt.libvirt.driver [-] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Instance spawned successfully.
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.087 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.093 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.097 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.110 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.111 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.112 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.113 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.114 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.115 187122 DEBUG nova.virt.libvirt.driver [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.122 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.123 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994819.0740297, 70f125d3-772c-4512-89cd-87864bebf8cc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.123 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] VM Paused (Lifecycle Event)
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.153 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.156 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994819.0764594, 70f125d3-772c-4512-89cd-87864bebf8cc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.157 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] VM Resumed (Lifecycle Event)
Nov 24 14:33:39 compute-0 podman[215903]: 2025-11-24 14:33:39.170040384 +0000 UTC m=+0.045578224 container create cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.183 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.185 187122 INFO nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Took 4.27 seconds to spawn the instance on the hypervisor.
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.187 187122 DEBUG nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.190 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:33:39 compute-0 systemd[1]: Started libpod-conmon-cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d.scope.
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.220 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:33:39 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:33:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd0de350e27a4bc18a33fd6a9366a9729f17239c500ba1354741ca17159745bc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:33:39 compute-0 podman[215903]: 2025-11-24 14:33:39.240501975 +0000 UTC m=+0.116039825 container init cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 24 14:33:39 compute-0 podman[215903]: 2025-11-24 14:33:39.146210374 +0000 UTC m=+0.021748234 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:33:39 compute-0 podman[215903]: 2025-11-24 14:33:39.245604363 +0000 UTC m=+0.121142203 container start cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.247 187122 INFO nova.compute.manager [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Took 4.67 seconds to build instance.
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.259 187122 DEBUG oslo_concurrency.lockutils [None req-10db7ade-393d-4f4c-8b89-dd476fd27d02 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.740s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:39 compute-0 neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c[215918]: [NOTICE]   (215922) : New worker (215924) forked
Nov 24 14:33:39 compute-0 neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c[215918]: [NOTICE]   (215922) : Loading success.
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.819 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.821 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.822 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.822 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.822 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.902 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.957 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:33:39 compute-0 nova_compute[187118]: 2025-11-24 14:33:39.958 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.011 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.195 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.196 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5708MB free_disk=73.45804595947266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.197 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.197 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.277 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance 70f125d3-772c-4512-89cd-87864bebf8cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.278 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.278 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.317 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.335 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.358 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.358 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.909 187122 DEBUG nova.compute.manager [req-57a535d5-da01-40a9-b407-70590c201eaa req-d30152a6-683a-4002-a9a0-6d7132611045 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-plugged-80657a89-07d8-4355-a80e-f13874579df8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.910 187122 DEBUG oslo_concurrency.lockutils [req-57a535d5-da01-40a9-b407-70590c201eaa req-d30152a6-683a-4002-a9a0-6d7132611045 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.911 187122 DEBUG oslo_concurrency.lockutils [req-57a535d5-da01-40a9-b407-70590c201eaa req-d30152a6-683a-4002-a9a0-6d7132611045 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.911 187122 DEBUG oslo_concurrency.lockutils [req-57a535d5-da01-40a9-b407-70590c201eaa req-d30152a6-683a-4002-a9a0-6d7132611045 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.912 187122 DEBUG nova.compute.manager [req-57a535d5-da01-40a9-b407-70590c201eaa req-d30152a6-683a-4002-a9a0-6d7132611045 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] No waiting events found dispatching network-vif-plugged-80657a89-07d8-4355-a80e-f13874579df8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:33:40 compute-0 nova_compute[187118]: 2025-11-24 14:33:40.912 187122 WARNING nova.compute.manager [req-57a535d5-da01-40a9-b407-70590c201eaa req-d30152a6-683a-4002-a9a0-6d7132611045 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received unexpected event network-vif-plugged-80657a89-07d8-4355-a80e-f13874579df8 for instance with vm_state active and task_state None.
Nov 24 14:33:41 compute-0 nova_compute[187118]: 2025-11-24 14:33:41.354 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:33:41 compute-0 nova_compute[187118]: 2025-11-24 14:33:41.355 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:33:41 compute-0 nova_compute[187118]: 2025-11-24 14:33:41.355 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:33:41 compute-0 NetworkManager[55697]: <info>  [1763994821.6983] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Nov 24 14:33:41 compute-0 ovn_controller[95613]: 2025-11-24T14:33:41Z|00087|binding|INFO|Releasing lport f0d428f1-79c6-415d-9945-8d2b6b384323 from this chassis (sb_readonly=0)
Nov 24 14:33:41 compute-0 NetworkManager[55697]: <info>  [1763994821.7015] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Nov 24 14:33:41 compute-0 nova_compute[187118]: 2025-11-24 14:33:41.719 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:41 compute-0 nova_compute[187118]: 2025-11-24 14:33:41.735 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:41 compute-0 ovn_controller[95613]: 2025-11-24T14:33:41Z|00088|binding|INFO|Releasing lport f0d428f1-79c6-415d-9945-8d2b6b384323 from this chassis (sb_readonly=0)
Nov 24 14:33:41 compute-0 nova_compute[187118]: 2025-11-24 14:33:41.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:33:42 compute-0 podman[215941]: 2025-11-24 14:33:42.460339827 +0000 UTC m=+0.068905349 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:33:42 compute-0 nova_compute[187118]: 2025-11-24 14:33:42.745 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:42 compute-0 nova_compute[187118]: 2025-11-24 14:33:42.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:33:43 compute-0 nova_compute[187118]: 2025-11-24 14:33:43.002 187122 DEBUG nova.compute.manager [req-30f49ebc-cb0d-4f88-8481-6095bf227d7d req-39903330-3905-480f-8ffc-027af3f1f1a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-changed-80657a89-07d8-4355-a80e-f13874579df8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:33:43 compute-0 nova_compute[187118]: 2025-11-24 14:33:43.003 187122 DEBUG nova.compute.manager [req-30f49ebc-cb0d-4f88-8481-6095bf227d7d req-39903330-3905-480f-8ffc-027af3f1f1a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing instance network info cache due to event network-changed-80657a89-07d8-4355-a80e-f13874579df8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:33:43 compute-0 nova_compute[187118]: 2025-11-24 14:33:43.003 187122 DEBUG oslo_concurrency.lockutils [req-30f49ebc-cb0d-4f88-8481-6095bf227d7d req-39903330-3905-480f-8ffc-027af3f1f1a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:33:43 compute-0 nova_compute[187118]: 2025-11-24 14:33:43.003 187122 DEBUG oslo_concurrency.lockutils [req-30f49ebc-cb0d-4f88-8481-6095bf227d7d req-39903330-3905-480f-8ffc-027af3f1f1a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:33:43 compute-0 nova_compute[187118]: 2025-11-24 14:33:43.004 187122 DEBUG nova.network.neutron [req-30f49ebc-cb0d-4f88-8481-6095bf227d7d req-39903330-3905-480f-8ffc-027af3f1f1a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing network info cache for port 80657a89-07d8-4355-a80e-f13874579df8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:33:44 compute-0 nova_compute[187118]: 2025-11-24 14:33:44.008 187122 DEBUG nova.network.neutron [req-30f49ebc-cb0d-4f88-8481-6095bf227d7d req-39903330-3905-480f-8ffc-027af3f1f1a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updated VIF entry in instance network info cache for port 80657a89-07d8-4355-a80e-f13874579df8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:33:44 compute-0 nova_compute[187118]: 2025-11-24 14:33:44.009 187122 DEBUG nova.network.neutron [req-30f49ebc-cb0d-4f88-8481-6095bf227d7d req-39903330-3905-480f-8ffc-027af3f1f1a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:33:44 compute-0 nova_compute[187118]: 2025-11-24 14:33:44.026 187122 DEBUG oslo_concurrency.lockutils [req-30f49ebc-cb0d-4f88-8481-6095bf227d7d req-39903330-3905-480f-8ffc-027af3f1f1a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:33:44 compute-0 nova_compute[187118]: 2025-11-24 14:33:44.820 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:47 compute-0 podman[215959]: 2025-11-24 14:33:47.446535269 +0000 UTC m=+0.058514776 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:33:47 compute-0 podman[215960]: 2025-11-24 14:33:47.472531477 +0000 UTC m=+0.083074445 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 14:33:47 compute-0 nova_compute[187118]: 2025-11-24 14:33:47.749 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:49 compute-0 nova_compute[187118]: 2025-11-24 14:33:49.823 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:51 compute-0 ovn_controller[95613]: 2025-11-24T14:33:51Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:5b:4f 10.100.0.5
Nov 24 14:33:51 compute-0 ovn_controller[95613]: 2025-11-24T14:33:51Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:5b:4f 10.100.0.5
Nov 24 14:33:52 compute-0 podman[216012]: 2025-11-24 14:33:52.45803337 +0000 UTC m=+0.062046802 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Nov 24 14:33:52 compute-0 nova_compute[187118]: 2025-11-24 14:33:52.752 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:53 compute-0 podman[216034]: 2025-11-24 14:33:53.481928609 +0000 UTC m=+0.090839416 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:33:54 compute-0 nova_compute[187118]: 2025-11-24 14:33:54.827 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:56.661 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:56.663 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:56.664 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:33:56 compute-0 nova_compute[187118]: 2025-11-24 14:33:56.672 187122 INFO nova.compute.manager [None req-dade14a7-9025-4c9a-b8ab-61b5f83e900c ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Get console output
Nov 24 14:33:56 compute-0 nova_compute[187118]: 2025-11-24 14:33:56.678 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:33:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:57.398 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:33:57 compute-0 nova_compute[187118]: 2025-11-24 14:33:57.399 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:57.400 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:33:57 compute-0 nova_compute[187118]: 2025-11-24 14:33:57.752 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:33:58 compute-0 podman[216061]: 2025-11-24 14:33:58.463008943 +0000 UTC m=+0.063632155 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:33:59 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:33:59.402 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:33:59 compute-0 nova_compute[187118]: 2025-11-24 14:33:59.770 187122 DEBUG oslo_concurrency.lockutils [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "interface-70f125d3-772c-4512-89cd-87864bebf8cc-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:33:59 compute-0 nova_compute[187118]: 2025-11-24 14:33:59.771 187122 DEBUG oslo_concurrency.lockutils [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "interface-70f125d3-772c-4512-89cd-87864bebf8cc-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:33:59 compute-0 nova_compute[187118]: 2025-11-24 14:33:59.771 187122 DEBUG nova.objects.instance [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'flavor' on Instance uuid 70f125d3-772c-4512-89cd-87864bebf8cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:33:59 compute-0 nova_compute[187118]: 2025-11-24 14:33:59.829 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:00 compute-0 nova_compute[187118]: 2025-11-24 14:34:00.248 187122 DEBUG nova.objects.instance [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_requests' on Instance uuid 70f125d3-772c-4512-89cd-87864bebf8cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:34:00 compute-0 nova_compute[187118]: 2025-11-24 14:34:00.259 187122 DEBUG nova.network.neutron [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:34:00 compute-0 nova_compute[187118]: 2025-11-24 14:34:00.481 187122 DEBUG nova.policy [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:34:01 compute-0 nova_compute[187118]: 2025-11-24 14:34:01.028 187122 DEBUG nova.network.neutron [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Successfully created port: fe1e3b21-532c-47fd-89c8-481678f2454b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:34:01 compute-0 nova_compute[187118]: 2025-11-24 14:34:01.749 187122 DEBUG nova.network.neutron [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Successfully updated port: fe1e3b21-532c-47fd-89c8-481678f2454b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:34:01 compute-0 nova_compute[187118]: 2025-11-24 14:34:01.763 187122 DEBUG oslo_concurrency.lockutils [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:34:01 compute-0 nova_compute[187118]: 2025-11-24 14:34:01.763 187122 DEBUG oslo_concurrency.lockutils [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:34:01 compute-0 nova_compute[187118]: 2025-11-24 14:34:01.763 187122 DEBUG nova.network.neutron [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:34:01 compute-0 nova_compute[187118]: 2025-11-24 14:34:01.892 187122 DEBUG nova.compute.manager [req-30757e2e-9a60-46d1-8d7d-8d9ba18bae4f req-9507551c-42f4-4b79-b131-c13bef780c19 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-changed-fe1e3b21-532c-47fd-89c8-481678f2454b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:34:01 compute-0 nova_compute[187118]: 2025-11-24 14:34:01.892 187122 DEBUG nova.compute.manager [req-30757e2e-9a60-46d1-8d7d-8d9ba18bae4f req-9507551c-42f4-4b79-b131-c13bef780c19 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing instance network info cache due to event network-changed-fe1e3b21-532c-47fd-89c8-481678f2454b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:34:01 compute-0 nova_compute[187118]: 2025-11-24 14:34:01.893 187122 DEBUG oslo_concurrency.lockutils [req-30757e2e-9a60-46d1-8d7d-8d9ba18bae4f req-9507551c-42f4-4b79-b131-c13bef780c19 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:34:02 compute-0 nova_compute[187118]: 2025-11-24 14:34:02.755 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:04 compute-0 nova_compute[187118]: 2025-11-24 14:34:04.831 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.359 187122 DEBUG nova.network.neutron [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.384 187122 DEBUG oslo_concurrency.lockutils [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.385 187122 DEBUG oslo_concurrency.lockutils [req-30757e2e-9a60-46d1-8d7d-8d9ba18bae4f req-9507551c-42f4-4b79-b131-c13bef780c19 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.386 187122 DEBUG nova.network.neutron [req-30757e2e-9a60-46d1-8d7d-8d9ba18bae4f req-9507551c-42f4-4b79-b131-c13bef780c19 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing network info cache for port fe1e3b21-532c-47fd-89c8-481678f2454b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.389 187122 DEBUG nova.virt.libvirt.vif [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:33:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-814470289',display_name='tempest-TestNetworkBasicOps-server-814470289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-814470289',id=6,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNxVUOsd/7KtYLgQBtlHKzBeWF9UhFxiZgEb7YLnyBIIN1OVKJ0gJRpD8NWMGNkw7u8jH0JIXAWvXNBkBhNhVRmW7IlL2b/guGzfz0SVJ7p7J0ywko8iMgOfh8p0fPQCuw==',key_name='tempest-TestNetworkBasicOps-1052892258',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:33:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-00ajpf73',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:33:39Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=70f125d3-772c-4512-89cd-87864bebf8cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.389 187122 DEBUG nova.network.os_vif_util [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.390 187122 DEBUG nova.network.os_vif_util [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.390 187122 DEBUG os_vif [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.391 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.391 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.392 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.395 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.395 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe1e3b21-53, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.395 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe1e3b21-53, col_values=(('external_ids', {'iface-id': 'fe1e3b21-532c-47fd-89c8-481678f2454b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f5:6e:76', 'vm-uuid': '70f125d3-772c-4512-89cd-87864bebf8cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:06 compute-0 NetworkManager[55697]: <info>  [1763994846.3974] manager: (tapfe1e3b21-53): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.400 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.401 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.406 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.406 187122 INFO os_vif [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53')
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.407 187122 DEBUG nova.virt.libvirt.vif [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:33:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-814470289',display_name='tempest-TestNetworkBasicOps-server-814470289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-814470289',id=6,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNxVUOsd/7KtYLgQBtlHKzBeWF9UhFxiZgEb7YLnyBIIN1OVKJ0gJRpD8NWMGNkw7u8jH0JIXAWvXNBkBhNhVRmW7IlL2b/guGzfz0SVJ7p7J0ywko8iMgOfh8p0fPQCuw==',key_name='tempest-TestNetworkBasicOps-1052892258',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:33:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-00ajpf73',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:33:39Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=70f125d3-772c-4512-89cd-87864bebf8cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.407 187122 DEBUG nova.network.os_vif_util [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.408 187122 DEBUG nova.network.os_vif_util [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.410 187122 DEBUG nova.virt.libvirt.guest [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] attach device xml: <interface type="ethernet">
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <mac address="fa:16:3e:f5:6e:76"/>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <model type="virtio"/>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <mtu size="1442"/>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <target dev="tapfe1e3b21-53"/>
Nov 24 14:34:06 compute-0 nova_compute[187118]: </interface>
Nov 24 14:34:06 compute-0 nova_compute[187118]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Nov 24 14:34:06 compute-0 kernel: tapfe1e3b21-53: entered promiscuous mode
Nov 24 14:34:06 compute-0 NetworkManager[55697]: <info>  [1763994846.4238] manager: (tapfe1e3b21-53): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.424 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 ovn_controller[95613]: 2025-11-24T14:34:06Z|00089|binding|INFO|Claiming lport fe1e3b21-532c-47fd-89c8-481678f2454b for this chassis.
Nov 24 14:34:06 compute-0 ovn_controller[95613]: 2025-11-24T14:34:06Z|00090|binding|INFO|fe1e3b21-532c-47fd-89c8-481678f2454b: Claiming fa:16:3e:f5:6e:76 10.100.0.21
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.428 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.440 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:6e:76 10.100.0.21'], port_security=['fa:16:3e:f5:6e:76 10.100.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '70f125d3-772c-4512-89cd-87864bebf8cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86594553-2610-4677-ad9a-258b4f3e5a3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cac3ec3-3f65-41d6-96cd-4c08dbf282d0, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=fe1e3b21-532c-47fd-89c8-481678f2454b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.441 104469 INFO neutron.agent.ovn.metadata.agent [-] Port fe1e3b21-532c-47fd-89c8-481678f2454b in datapath ab0b30f6-b57a-4fe9-b7c2-d307773590ec bound to our chassis
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.442 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ab0b30f6-b57a-4fe9-b7c2-d307773590ec
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.455 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[eb0c9826-1e6d-422f-be2d-5e3d47cb272a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.457 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapab0b30f6-b1 in ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:34:06 compute-0 systemd-udevd[216092]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.459 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.459 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapab0b30f6-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.459 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[23f471a4-1591-4e1d-ad0c-d1bf570b2beb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.461 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac12076-c18a-4f05-9f41-7666852741f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_controller[95613]: 2025-11-24T14:34:06Z|00091|binding|INFO|Setting lport fe1e3b21-532c-47fd-89c8-481678f2454b ovn-installed in OVS
Nov 24 14:34:06 compute-0 ovn_controller[95613]: 2025-11-24T14:34:06Z|00092|binding|INFO|Setting lport fe1e3b21-532c-47fd-89c8-481678f2454b up in Southbound
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.463 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 NetworkManager[55697]: <info>  [1763994846.4724] device (tapfe1e3b21-53): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:34:06 compute-0 NetworkManager[55697]: <info>  [1763994846.4733] device (tapfe1e3b21-53): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.475 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[cd71ab85-8a7d-467d-972c-c11bf5af9aa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.497 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[cb0b34f3-0dc1-4503-84f6-5018f6839df7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.509 187122 DEBUG nova.virt.libvirt.driver [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.509 187122 DEBUG nova.virt.libvirt.driver [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.510 187122 DEBUG nova.virt.libvirt.driver [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:74:5b:4f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.510 187122 DEBUG nova.virt.libvirt.driver [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:f5:6e:76, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.526 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[ac6cd276-a7ec-49a2-86ea-ff113f6cd540]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.529 187122 DEBUG nova.virt.libvirt.guest [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-814470289</nova:name>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:34:06</nova:creationTime>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:34:06 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     <nova:port uuid="80657a89-07d8-4355-a80e-f13874579df8">
Nov 24 14:34:06 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     <nova:port uuid="fe1e3b21-532c-47fd-89c8-481678f2454b">
Nov 24 14:34:06 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 24 14:34:06 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:34:06 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:34:06 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:34:06 compute-0 nova_compute[187118]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.531 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ce242ebc-b576-40ab-b8a2-273be3ea0539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 NetworkManager[55697]: <info>  [1763994846.5322] manager: (tapab0b30f6-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.551 187122 DEBUG oslo_concurrency.lockutils [None req-83a41cfa-7033-4175-ac7c-f97824477d6f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "interface-70f125d3-772c-4512-89cd-87864bebf8cc-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.780s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.559 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[dafd1161-4e36-44f2-8a56-c0572b5370b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.561 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7e5ed2-59a6-4603-b1a0-a0605deaa565]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 NetworkManager[55697]: <info>  [1763994846.5785] device (tapab0b30f6-b0): carrier: link connected
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.582 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[e2236234-0168-4100-bef5-2e76cb92fbd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.596 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[66b6d9ab-4b3a-47d5-befb-096304cb798b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab0b30f6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:db:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 311400, 'reachable_time': 28372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216118, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.612 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[0bb8dd4e-7d67-4782-afdd-a7eabaaac15d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe66:db85'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 311400, 'tstamp': 311400}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216119, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.625 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f2fcf2c0-da12-4de7-b050-ab52ef4a6a17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab0b30f6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:db:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 311400, 'reachable_time': 28372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216120, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.665 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[c026012c-3b4d-4b7d-86a9-3aa2887bfbfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.733 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3876d8f3-3305-4ee3-94a5-7e6a23c1cdb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.734 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab0b30f6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.734 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.735 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab0b30f6-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.736 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 NetworkManager[55697]: <info>  [1763994846.7373] manager: (tapab0b30f6-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Nov 24 14:34:06 compute-0 kernel: tapab0b30f6-b0: entered promiscuous mode
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.741 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapab0b30f6-b0, col_values=(('external_ids', {'iface-id': '2a7bcd0d-ee04-4834-b94b-d2234a1d3740'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:06 compute-0 ovn_controller[95613]: 2025-11-24T14:34:06Z|00093|binding|INFO|Releasing lport 2a7bcd0d-ee04-4834-b94b-d2234a1d3740 from this chassis (sb_readonly=0)
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.743 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.744 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.747 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ab0b30f6-b57a-4fe9-b7c2-d307773590ec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ab0b30f6-b57a-4fe9-b7c2-d307773590ec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.748 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7b39506e-9a0c-4556-90d1-42855b1888e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.749 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-ab0b30f6-b57a-4fe9-b7c2-d307773590ec
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/ab0b30f6-b57a-4fe9-b7c2-d307773590ec.pid.haproxy
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID ab0b30f6-b57a-4fe9-b7c2-d307773590ec
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:34:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:06.750 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'env', 'PROCESS_TAG=haproxy-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ab0b30f6-b57a-4fe9-b7c2-d307773590ec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:34:06 compute-0 nova_compute[187118]: 2025-11-24 14:34:06.755 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:07 compute-0 podman[216152]: 2025-11-24 14:34:07.096125943 +0000 UTC m=+0.045168432 container create f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:34:07 compute-0 systemd[1]: Started libpod-conmon-f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a.scope.
Nov 24 14:34:07 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:34:07 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e657a38b0bbc3b680db9fa820f06b3bc61f5c28c379c1688570d47e734ae638/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:34:07 compute-0 podman[216152]: 2025-11-24 14:34:07.06999141 +0000 UTC m=+0.019033919 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:34:07 compute-0 podman[216152]: 2025-11-24 14:34:07.175573269 +0000 UTC m=+0.124615778 container init f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, tcib_managed=true)
Nov 24 14:34:07 compute-0 podman[216152]: 2025-11-24 14:34:07.185458568 +0000 UTC m=+0.134501047 container start f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:34:07 compute-0 neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec[216167]: [NOTICE]   (216171) : New worker (216173) forked
Nov 24 14:34:07 compute-0 neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec[216167]: [NOTICE]   (216171) : Loading success.
Nov 24 14:34:07 compute-0 nova_compute[187118]: 2025-11-24 14:34:07.519 187122 DEBUG nova.compute.manager [req-68edd1f2-2b90-4ce9-93aa-e0cc722b237a req-cac1fdc1-59d8-4222-a93b-d1a0a7a5518f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-plugged-fe1e3b21-532c-47fd-89c8-481678f2454b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:34:07 compute-0 nova_compute[187118]: 2025-11-24 14:34:07.520 187122 DEBUG oslo_concurrency.lockutils [req-68edd1f2-2b90-4ce9-93aa-e0cc722b237a req-cac1fdc1-59d8-4222-a93b-d1a0a7a5518f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:07 compute-0 nova_compute[187118]: 2025-11-24 14:34:07.520 187122 DEBUG oslo_concurrency.lockutils [req-68edd1f2-2b90-4ce9-93aa-e0cc722b237a req-cac1fdc1-59d8-4222-a93b-d1a0a7a5518f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:07 compute-0 nova_compute[187118]: 2025-11-24 14:34:07.521 187122 DEBUG oslo_concurrency.lockutils [req-68edd1f2-2b90-4ce9-93aa-e0cc722b237a req-cac1fdc1-59d8-4222-a93b-d1a0a7a5518f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:07 compute-0 nova_compute[187118]: 2025-11-24 14:34:07.521 187122 DEBUG nova.compute.manager [req-68edd1f2-2b90-4ce9-93aa-e0cc722b237a req-cac1fdc1-59d8-4222-a93b-d1a0a7a5518f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] No waiting events found dispatching network-vif-plugged-fe1e3b21-532c-47fd-89c8-481678f2454b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:34:07 compute-0 nova_compute[187118]: 2025-11-24 14:34:07.522 187122 WARNING nova.compute.manager [req-68edd1f2-2b90-4ce9-93aa-e0cc722b237a req-cac1fdc1-59d8-4222-a93b-d1a0a7a5518f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received unexpected event network-vif-plugged-fe1e3b21-532c-47fd-89c8-481678f2454b for instance with vm_state active and task_state None.
Nov 24 14:34:08 compute-0 ovn_controller[95613]: 2025-11-24T14:34:08Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f5:6e:76 10.100.0.21
Nov 24 14:34:08 compute-0 ovn_controller[95613]: 2025-11-24T14:34:08Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f5:6e:76 10.100.0.21
Nov 24 14:34:08 compute-0 nova_compute[187118]: 2025-11-24 14:34:08.644 187122 DEBUG nova.network.neutron [req-30757e2e-9a60-46d1-8d7d-8d9ba18bae4f req-9507551c-42f4-4b79-b131-c13bef780c19 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updated VIF entry in instance network info cache for port fe1e3b21-532c-47fd-89c8-481678f2454b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:34:08 compute-0 nova_compute[187118]: 2025-11-24 14:34:08.644 187122 DEBUG nova.network.neutron [req-30757e2e-9a60-46d1-8d7d-8d9ba18bae4f req-9507551c-42f4-4b79-b131-c13bef780c19 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:34:08 compute-0 nova_compute[187118]: 2025-11-24 14:34:08.658 187122 DEBUG oslo_concurrency.lockutils [req-30757e2e-9a60-46d1-8d7d-8d9ba18bae4f req-9507551c-42f4-4b79-b131-c13bef780c19 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:34:09 compute-0 podman[216182]: 2025-11-24 14:34:09.464476149 +0000 UTC m=+0.062203637 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:34:09 compute-0 nova_compute[187118]: 2025-11-24 14:34:09.799 187122 DEBUG nova.compute.manager [req-ef270f07-5ea5-4b4b-9b45-b0067899caeb req-66a2a8c5-1103-4576-8798-63ed8368aad0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-plugged-fe1e3b21-532c-47fd-89c8-481678f2454b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:34:09 compute-0 nova_compute[187118]: 2025-11-24 14:34:09.799 187122 DEBUG oslo_concurrency.lockutils [req-ef270f07-5ea5-4b4b-9b45-b0067899caeb req-66a2a8c5-1103-4576-8798-63ed8368aad0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:09 compute-0 nova_compute[187118]: 2025-11-24 14:34:09.799 187122 DEBUG oslo_concurrency.lockutils [req-ef270f07-5ea5-4b4b-9b45-b0067899caeb req-66a2a8c5-1103-4576-8798-63ed8368aad0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:09 compute-0 nova_compute[187118]: 2025-11-24 14:34:09.800 187122 DEBUG oslo_concurrency.lockutils [req-ef270f07-5ea5-4b4b-9b45-b0067899caeb req-66a2a8c5-1103-4576-8798-63ed8368aad0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:09 compute-0 nova_compute[187118]: 2025-11-24 14:34:09.800 187122 DEBUG nova.compute.manager [req-ef270f07-5ea5-4b4b-9b45-b0067899caeb req-66a2a8c5-1103-4576-8798-63ed8368aad0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] No waiting events found dispatching network-vif-plugged-fe1e3b21-532c-47fd-89c8-481678f2454b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:34:09 compute-0 nova_compute[187118]: 2025-11-24 14:34:09.800 187122 WARNING nova.compute.manager [req-ef270f07-5ea5-4b4b-9b45-b0067899caeb req-66a2a8c5-1103-4576-8798-63ed8368aad0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received unexpected event network-vif-plugged-fe1e3b21-532c-47fd-89c8-481678f2454b for instance with vm_state active and task_state None.
Nov 24 14:34:09 compute-0 nova_compute[187118]: 2025-11-24 14:34:09.833 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:11 compute-0 nova_compute[187118]: 2025-11-24 14:34:11.398 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:13 compute-0 podman[216207]: 2025-11-24 14:34:13.451760684 +0000 UTC m=+0.056006078 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.536 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9b3efeab-7379-4e78-8df8-032e6e66cd67" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.536 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.555 187122 DEBUG nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.639 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.640 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.653 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.654 187122 INFO nova.compute.claims [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.771 187122 DEBUG nova.compute.provider_tree [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.783 187122 DEBUG nova.scheduler.client.report [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.804 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.805 187122 DEBUG nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.835 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.855 187122 DEBUG nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.856 187122 DEBUG nova.network.neutron [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.873 187122 INFO nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.888 187122 DEBUG nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.970 187122 DEBUG nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.971 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.972 187122 INFO nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Creating image(s)
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.972 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.973 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.973 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:14 compute-0 nova_compute[187118]: 2025-11-24 14:34:14.991 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.043 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.043 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.044 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.054 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.104 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.104 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.134 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.135 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.135 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.184 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.185 187122 DEBUG nova.virt.disk.api [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.186 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.237 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.238 187122 DEBUG nova.virt.disk.api [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.238 187122 DEBUG nova.objects.instance [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid 9b3efeab-7379-4e78-8df8-032e6e66cd67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.250 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.250 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Ensure instance console log exists: /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.251 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.251 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.251 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.255 187122 DEBUG nova.policy [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:34:15 compute-0 nova_compute[187118]: 2025-11-24 14:34:15.852 187122 DEBUG nova.network.neutron [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Successfully created port: 567cb66f-ac48-449e-accf-08c9a578a66c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:34:16 compute-0 nova_compute[187118]: 2025-11-24 14:34:16.405 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:16 compute-0 nova_compute[187118]: 2025-11-24 14:34:16.435 187122 DEBUG nova.network.neutron [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Successfully updated port: 567cb66f-ac48-449e-accf-08c9a578a66c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:34:16 compute-0 nova_compute[187118]: 2025-11-24 14:34:16.450 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-9b3efeab-7379-4e78-8df8-032e6e66cd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:34:16 compute-0 nova_compute[187118]: 2025-11-24 14:34:16.450 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-9b3efeab-7379-4e78-8df8-032e6e66cd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:34:16 compute-0 nova_compute[187118]: 2025-11-24 14:34:16.450 187122 DEBUG nova.network.neutron [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:34:16 compute-0 nova_compute[187118]: 2025-11-24 14:34:16.529 187122 DEBUG nova.compute.manager [req-e3385784-bf65-4616-a33b-ef757da8d2dd req-21e4e7ff-0dac-401d-9933-37bc5638a12e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Received event network-changed-567cb66f-ac48-449e-accf-08c9a578a66c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:34:16 compute-0 nova_compute[187118]: 2025-11-24 14:34:16.529 187122 DEBUG nova.compute.manager [req-e3385784-bf65-4616-a33b-ef757da8d2dd req-21e4e7ff-0dac-401d-9933-37bc5638a12e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Refreshing instance network info cache due to event network-changed-567cb66f-ac48-449e-accf-08c9a578a66c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:34:16 compute-0 nova_compute[187118]: 2025-11-24 14:34:16.530 187122 DEBUG oslo_concurrency.lockutils [req-e3385784-bf65-4616-a33b-ef757da8d2dd req-21e4e7ff-0dac-401d-9933-37bc5638a12e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-9b3efeab-7379-4e78-8df8-032e6e66cd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:34:16 compute-0 nova_compute[187118]: 2025-11-24 14:34:16.598 187122 DEBUG nova.network.neutron [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.424 187122 DEBUG nova.network.neutron [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Updating instance_info_cache with network_info: [{"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.442 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-9b3efeab-7379-4e78-8df8-032e6e66cd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.442 187122 DEBUG nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Instance network_info: |[{"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.443 187122 DEBUG oslo_concurrency.lockutils [req-e3385784-bf65-4616-a33b-ef757da8d2dd req-21e4e7ff-0dac-401d-9933-37bc5638a12e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-9b3efeab-7379-4e78-8df8-032e6e66cd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.443 187122 DEBUG nova.network.neutron [req-e3385784-bf65-4616-a33b-ef757da8d2dd req-21e4e7ff-0dac-401d-9933-37bc5638a12e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Refreshing network info cache for port 567cb66f-ac48-449e-accf-08c9a578a66c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.445 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Start _get_guest_xml network_info=[{"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.450 187122 WARNING nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.459 187122 DEBUG nova.virt.libvirt.host [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.460 187122 DEBUG nova.virt.libvirt.host [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.464 187122 DEBUG nova.virt.libvirt.host [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.465 187122 DEBUG nova.virt.libvirt.host [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.465 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.466 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.466 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.466 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.467 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.467 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.467 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.467 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.468 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.468 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.468 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.468 187122 DEBUG nova.virt.hardware [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.472 187122 DEBUG nova.virt.libvirt.vif [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:34:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-341495495',display_name='tempest-TestNetworkBasicOps-server-341495495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-341495495',id=7,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOZkxL5wi9Q/u/qvNDr2KmRfqGzm1BMXjeAzoCz5U2QtADvT2sbNAlUmFRNErLKp6Wu4BBlqdVbN2TTC5tMIEqal5FJUQiOsx1yeNac03Gqj1PJHUBfugrNfq2yRaju5gA==',key_name='tempest-TestNetworkBasicOps-471487238',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-3h48y00q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:34:14Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=9b3efeab-7379-4e78-8df8-032e6e66cd67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.473 187122 DEBUG nova.network.os_vif_util [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.474 187122 DEBUG nova.network.os_vif_util [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=567cb66f-ac48-449e-accf-08c9a578a66c,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap567cb66f-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.475 187122 DEBUG nova.objects.instance [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9b3efeab-7379-4e78-8df8-032e6e66cd67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.493 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <uuid>9b3efeab-7379-4e78-8df8-032e6e66cd67</uuid>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <name>instance-00000007</name>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-341495495</nova:name>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:34:17</nova:creationTime>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:34:17 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:34:17 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:34:17 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:34:17 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:34:17 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:34:17 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:34:17 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:34:17 compute-0 nova_compute[187118]:         <nova:port uuid="567cb66f-ac48-449e-accf-08c9a578a66c">
Nov 24 14:34:17 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <system>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <entry name="serial">9b3efeab-7379-4e78-8df8-032e6e66cd67</entry>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <entry name="uuid">9b3efeab-7379-4e78-8df8-032e6e66cd67</entry>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     </system>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <os>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   </os>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <features>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   </features>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk.config"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:85:0a:c9"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <target dev="tap567cb66f-ac"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/console.log" append="off"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <video>
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     </video>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:34:17 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:34:17 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:34:17 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:34:17 compute-0 nova_compute[187118]: </domain>
Nov 24 14:34:17 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.494 187122 DEBUG nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Preparing to wait for external event network-vif-plugged-567cb66f-ac48-449e-accf-08c9a578a66c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.495 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.495 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.495 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.496 187122 DEBUG nova.virt.libvirt.vif [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:34:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-341495495',display_name='tempest-TestNetworkBasicOps-server-341495495',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-341495495',id=7,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOZkxL5wi9Q/u/qvNDr2KmRfqGzm1BMXjeAzoCz5U2QtADvT2sbNAlUmFRNErLKp6Wu4BBlqdVbN2TTC5tMIEqal5FJUQiOsx1yeNac03Gqj1PJHUBfugrNfq2yRaju5gA==',key_name='tempest-TestNetworkBasicOps-471487238',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-3h48y00q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:34:14Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=9b3efeab-7379-4e78-8df8-032e6e66cd67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.496 187122 DEBUG nova.network.os_vif_util [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.497 187122 DEBUG nova.network.os_vif_util [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=567cb66f-ac48-449e-accf-08c9a578a66c,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap567cb66f-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.497 187122 DEBUG os_vif [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=567cb66f-ac48-449e-accf-08c9a578a66c,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap567cb66f-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.497 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.498 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.498 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.500 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.501 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap567cb66f-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.501 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap567cb66f-ac, col_values=(('external_ids', {'iface-id': '567cb66f-ac48-449e-accf-08c9a578a66c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:0a:c9', 'vm-uuid': '9b3efeab-7379-4e78-8df8-032e6e66cd67'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.503 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:17 compute-0 NetworkManager[55697]: <info>  [1763994857.5040] manager: (tap567cb66f-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.505 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.509 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.509 187122 INFO os_vif [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=567cb66f-ac48-449e-accf-08c9a578a66c,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap567cb66f-ac')
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.551 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.551 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.552 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:85:0a:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.552 187122 INFO nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Using config drive
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.815 187122 INFO nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Creating config drive at /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk.config
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.820 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsi6f4tt3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:17 compute-0 nova_compute[187118]: 2025-11-24 14:34:17.958 187122 DEBUG oslo_concurrency.processutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsi6f4tt3" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:18 compute-0 kernel: tap567cb66f-ac: entered promiscuous mode
Nov 24 14:34:18 compute-0 NetworkManager[55697]: <info>  [1763994858.0331] manager: (tap567cb66f-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Nov 24 14:34:18 compute-0 ovn_controller[95613]: 2025-11-24T14:34:18Z|00094|binding|INFO|Claiming lport 567cb66f-ac48-449e-accf-08c9a578a66c for this chassis.
Nov 24 14:34:18 compute-0 ovn_controller[95613]: 2025-11-24T14:34:18Z|00095|binding|INFO|567cb66f-ac48-449e-accf-08c9a578a66c: Claiming fa:16:3e:85:0a:c9 10.100.0.25
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.041 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.051 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:0a:c9 10.100.0.25'], port_security=['fa:16:3e:85:0a:c9 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '9b3efeab-7379-4e78-8df8-032e6e66cd67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '03bfcc80-c0c3-4e20-9635-e8e21409c08e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cac3ec3-3f65-41d6-96cd-4c08dbf282d0, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=567cb66f-ac48-449e-accf-08c9a578a66c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.052 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 567cb66f-ac48-449e-accf-08c9a578a66c in datapath ab0b30f6-b57a-4fe9-b7c2-d307773590ec bound to our chassis
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.053 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ab0b30f6-b57a-4fe9-b7c2-d307773590ec
Nov 24 14:34:18 compute-0 ovn_controller[95613]: 2025-11-24T14:34:18Z|00096|binding|INFO|Setting lport 567cb66f-ac48-449e-accf-08c9a578a66c ovn-installed in OVS
Nov 24 14:34:18 compute-0 ovn_controller[95613]: 2025-11-24T14:34:18Z|00097|binding|INFO|Setting lport 567cb66f-ac48-449e-accf-08c9a578a66c up in Southbound
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.072 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.077 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:18 compute-0 systemd-udevd[216282]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.080 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[71d375e0-e949-4819-85f3-73b00ebca929]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:18 compute-0 systemd-machined[153483]: New machine qemu-7-instance-00000007.
Nov 24 14:34:18 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Nov 24 14:34:18 compute-0 NetworkManager[55697]: <info>  [1763994858.1101] device (tap567cb66f-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:34:18 compute-0 NetworkManager[55697]: <info>  [1763994858.1107] device (tap567cb66f-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:34:18 compute-0 podman[216253]: 2025-11-24 14:34:18.115924568 +0000 UTC m=+0.103761839 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.120 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d9875a-7558-435f-acf3-0e41b927cdfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.123 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[b6ebc301-1849-4a1e-b393-4a677e5e03dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:18 compute-0 podman[216254]: 2025-11-24 14:34:18.125815117 +0000 UTC m=+0.097559559 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible)
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.153 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[d0832673-d429-4864-badc-9849556990a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.168 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[0fa17957-20c5-4f8f-8e68-64176d4c2d7b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab0b30f6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:db:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 311400, 'reachable_time': 28372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216309, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.184 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8958efa9-facf-41a3-9650-476b0de7814d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapab0b30f6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 311412, 'tstamp': 311412}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216311, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapab0b30f6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 311415, 'tstamp': 311415}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216311, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.186 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab0b30f6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.187 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.188 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.189 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab0b30f6-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.189 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.189 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapab0b30f6-b0, col_values=(('external_ids', {'iface-id': '2a7bcd0d-ee04-4834-b94b-d2234a1d3740'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:34:18 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:18.189 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.390 187122 DEBUG nova.compute.manager [req-88a89737-73e8-4374-abd0-322780e57d1f req-94c0bb06-a232-45b4-9bd5-0afa30c5dcd0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Received event network-vif-plugged-567cb66f-ac48-449e-accf-08c9a578a66c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.390 187122 DEBUG oslo_concurrency.lockutils [req-88a89737-73e8-4374-abd0-322780e57d1f req-94c0bb06-a232-45b4-9bd5-0afa30c5dcd0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.390 187122 DEBUG oslo_concurrency.lockutils [req-88a89737-73e8-4374-abd0-322780e57d1f req-94c0bb06-a232-45b4-9bd5-0afa30c5dcd0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.390 187122 DEBUG oslo_concurrency.lockutils [req-88a89737-73e8-4374-abd0-322780e57d1f req-94c0bb06-a232-45b4-9bd5-0afa30c5dcd0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.390 187122 DEBUG nova.compute.manager [req-88a89737-73e8-4374-abd0-322780e57d1f req-94c0bb06-a232-45b4-9bd5-0afa30c5dcd0 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Processing event network-vif-plugged-567cb66f-ac48-449e-accf-08c9a578a66c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.480 187122 DEBUG nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.481 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994858.4791992, 9b3efeab-7379-4e78-8df8-032e6e66cd67 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.481 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] VM Started (Lifecycle Event)
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.486 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.489 187122 INFO nova.virt.libvirt.driver [-] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Instance spawned successfully.
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.489 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.538 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.544 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.555 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.556 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.556 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.557 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.558 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.558 187122 DEBUG nova.virt.libvirt.driver [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.564 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.565 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994858.4840662, 9b3efeab-7379-4e78-8df8-032e6e66cd67 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.565 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] VM Paused (Lifecycle Event)
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.585 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.588 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994858.4863236, 9b3efeab-7379-4e78-8df8-032e6e66cd67 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.589 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] VM Resumed (Lifecycle Event)
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.606 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.610 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.616 187122 INFO nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Took 3.65 seconds to spawn the instance on the hypervisor.
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.616 187122 DEBUG nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.625 187122 DEBUG nova.network.neutron [req-e3385784-bf65-4616-a33b-ef757da8d2dd req-21e4e7ff-0dac-401d-9933-37bc5638a12e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Updated VIF entry in instance network info cache for port 567cb66f-ac48-449e-accf-08c9a578a66c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.625 187122 DEBUG nova.network.neutron [req-e3385784-bf65-4616-a33b-ef757da8d2dd req-21e4e7ff-0dac-401d-9933-37bc5638a12e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Updating instance_info_cache with network_info: [{"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.633 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.660 187122 DEBUG oslo_concurrency.lockutils [req-e3385784-bf65-4616-a33b-ef757da8d2dd req-21e4e7ff-0dac-401d-9933-37bc5638a12e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-9b3efeab-7379-4e78-8df8-032e6e66cd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.679 187122 INFO nova.compute.manager [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Took 4.07 seconds to build instance.
Nov 24 14:34:18 compute-0 nova_compute[187118]: 2025-11-24 14:34:18.692 187122 DEBUG oslo_concurrency.lockutils [None req-87f1a3b3-9bc4-4a06-b82f-38fb9e19bb0e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:19 compute-0 nova_compute[187118]: 2025-11-24 14:34:19.840 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:20 compute-0 nova_compute[187118]: 2025-11-24 14:34:20.505 187122 DEBUG nova.compute.manager [req-c5c02cb6-9128-4dc8-94ba-cd22932791cb req-a0100600-4da0-456b-ac0c-a38196e7605b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Received event network-vif-plugged-567cb66f-ac48-449e-accf-08c9a578a66c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:34:20 compute-0 nova_compute[187118]: 2025-11-24 14:34:20.506 187122 DEBUG oslo_concurrency.lockutils [req-c5c02cb6-9128-4dc8-94ba-cd22932791cb req-a0100600-4da0-456b-ac0c-a38196e7605b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:20 compute-0 nova_compute[187118]: 2025-11-24 14:34:20.507 187122 DEBUG oslo_concurrency.lockutils [req-c5c02cb6-9128-4dc8-94ba-cd22932791cb req-a0100600-4da0-456b-ac0c-a38196e7605b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:20 compute-0 nova_compute[187118]: 2025-11-24 14:34:20.507 187122 DEBUG oslo_concurrency.lockutils [req-c5c02cb6-9128-4dc8-94ba-cd22932791cb req-a0100600-4da0-456b-ac0c-a38196e7605b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:20 compute-0 nova_compute[187118]: 2025-11-24 14:34:20.508 187122 DEBUG nova.compute.manager [req-c5c02cb6-9128-4dc8-94ba-cd22932791cb req-a0100600-4da0-456b-ac0c-a38196e7605b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] No waiting events found dispatching network-vif-plugged-567cb66f-ac48-449e-accf-08c9a578a66c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:34:20 compute-0 nova_compute[187118]: 2025-11-24 14:34:20.509 187122 WARNING nova.compute.manager [req-c5c02cb6-9128-4dc8-94ba-cd22932791cb req-a0100600-4da0-456b-ac0c-a38196e7605b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Received unexpected event network-vif-plugged-567cb66f-ac48-449e-accf-08c9a578a66c for instance with vm_state active and task_state None.
Nov 24 14:34:22 compute-0 nova_compute[187118]: 2025-11-24 14:34:22.503 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:23 compute-0 podman[216319]: 2025-11-24 14:34:23.532518081 +0000 UTC m=+0.135170505 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Nov 24 14:34:23 compute-0 podman[216338]: 2025-11-24 14:34:23.670944225 +0000 UTC m=+0.136330978 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 14:34:24 compute-0 nova_compute[187118]: 2025-11-24 14:34:24.840 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:27 compute-0 nova_compute[187118]: 2025-11-24 14:34:27.505 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:29 compute-0 podman[216365]: 2025-11-24 14:34:29.481502157 +0000 UTC m=+0.072562269 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:34:29 compute-0 nova_compute[187118]: 2025-11-24 14:34:29.842 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:31 compute-0 nova_compute[187118]: 2025-11-24 14:34:31.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:31 compute-0 ovn_controller[95613]: 2025-11-24T14:34:31Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:85:0a:c9 10.100.0.25
Nov 24 14:34:31 compute-0 ovn_controller[95613]: 2025-11-24T14:34:31Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:85:0a:c9 10.100.0.25
Nov 24 14:34:32 compute-0 nova_compute[187118]: 2025-11-24 14:34:32.507 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:34 compute-0 nova_compute[187118]: 2025-11-24 14:34:34.846 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:35 compute-0 nova_compute[187118]: 2025-11-24 14:34:35.808 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:37 compute-0 nova_compute[187118]: 2025-11-24 14:34:37.509 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:37 compute-0 nova_compute[187118]: 2025-11-24 14:34:37.791 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:37 compute-0 nova_compute[187118]: 2025-11-24 14:34:37.823 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:39 compute-0 nova_compute[187118]: 2025-11-24 14:34:39.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:39 compute-0 nova_compute[187118]: 2025-11-24 14:34:39.821 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:39 compute-0 nova_compute[187118]: 2025-11-24 14:34:39.822 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:39 compute-0 nova_compute[187118]: 2025-11-24 14:34:39.823 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:39 compute-0 nova_compute[187118]: 2025-11-24 14:34:39.823 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:34:39 compute-0 nova_compute[187118]: 2025-11-24 14:34:39.848 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:39 compute-0 nova_compute[187118]: 2025-11-24 14:34:39.892 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:39 compute-0 podman[216400]: 2025-11-24 14:34:39.921339572 +0000 UTC m=+0.051406443 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 14:34:39 compute-0 nova_compute[187118]: 2025-11-24 14:34:39.975 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:39 compute-0 nova_compute[187118]: 2025-11-24 14:34:39.976 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.025 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.031 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.118 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.119 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.188 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.432 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.433 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5437MB free_disk=73.40046310424805GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.433 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.434 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.706 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance 70f125d3-772c-4512-89cd-87864bebf8cc actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.707 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance 9b3efeab-7379-4e78-8df8-032e6e66cd67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.707 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.708 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.820 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Refreshing inventories for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.864 187122 DEBUG nova.compute.manager [req-ee06c227-3227-4991-ad4e-a9ad0d597940 req-c51dee38-9f1c-41f2-a90d-90e1ea329c53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-changed-fe1e3b21-532c-47fd-89c8-481678f2454b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.865 187122 DEBUG nova.compute.manager [req-ee06c227-3227-4991-ad4e-a9ad0d597940 req-c51dee38-9f1c-41f2-a90d-90e1ea329c53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing instance network info cache due to event network-changed-fe1e3b21-532c-47fd-89c8-481678f2454b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.865 187122 DEBUG oslo_concurrency.lockutils [req-ee06c227-3227-4991-ad4e-a9ad0d597940 req-c51dee38-9f1c-41f2-a90d-90e1ea329c53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.865 187122 DEBUG oslo_concurrency.lockutils [req-ee06c227-3227-4991-ad4e-a9ad0d597940 req-c51dee38-9f1c-41f2-a90d-90e1ea329c53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.865 187122 DEBUG nova.network.neutron [req-ee06c227-3227-4991-ad4e-a9ad0d597940 req-c51dee38-9f1c-41f2-a90d-90e1ea329c53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing network info cache for port fe1e3b21-532c-47fd-89c8-481678f2454b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.930 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updating ProviderTree inventory for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.931 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updating inventory in ProviderTree for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.950 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Refreshing aggregate associations for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 14:34:40 compute-0 nova_compute[187118]: 2025-11-24 14:34:40.992 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Refreshing trait associations for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b, traits: COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AESNI,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.046 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.057 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.078 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.078 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.079 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.080 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.092 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:34:41 compute-0 nova_compute[187118]: 2025-11-24 14:34:41.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:34:42 compute-0 nova_compute[187118]: 2025-11-24 14:34:42.021 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:34:42 compute-0 nova_compute[187118]: 2025-11-24 14:34:42.303 187122 DEBUG nova.network.neutron [req-ee06c227-3227-4991-ad4e-a9ad0d597940 req-c51dee38-9f1c-41f2-a90d-90e1ea329c53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updated VIF entry in instance network info cache for port fe1e3b21-532c-47fd-89c8-481678f2454b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:34:42 compute-0 nova_compute[187118]: 2025-11-24 14:34:42.304 187122 DEBUG nova.network.neutron [req-ee06c227-3227-4991-ad4e-a9ad0d597940 req-c51dee38-9f1c-41f2-a90d-90e1ea329c53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:34:42 compute-0 nova_compute[187118]: 2025-11-24 14:34:42.321 187122 DEBUG oslo_concurrency.lockutils [req-ee06c227-3227-4991-ad4e-a9ad0d597940 req-c51dee38-9f1c-41f2-a90d-90e1ea329c53 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:34:42 compute-0 nova_compute[187118]: 2025-11-24 14:34:42.322 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquired lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:34:42 compute-0 nova_compute[187118]: 2025-11-24 14:34:42.323 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 14:34:42 compute-0 nova_compute[187118]: 2025-11-24 14:34:42.323 187122 DEBUG nova.objects.instance [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 70f125d3-772c-4512-89cd-87864bebf8cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:34:42 compute-0 nova_compute[187118]: 2025-11-24 14:34:42.511 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.904 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.930 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Releasing lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.930 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.931 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.931 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.931 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.931 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.932 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.932 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:34:43 compute-0 nova_compute[187118]: 2025-11-24 14:34:43.932 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 14:34:44 compute-0 podman[216436]: 2025-11-24 14:34:44.451486963 +0000 UTC m=+0.062201056 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 24 14:34:44 compute-0 nova_compute[187118]: 2025-11-24 14:34:44.852 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:47 compute-0 nova_compute[187118]: 2025-11-24 14:34:47.513 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:48 compute-0 podman[216457]: 2025-11-24 14:34:48.487238887 +0000 UTC m=+0.083236029 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 14:34:48 compute-0 podman[216458]: 2025-11-24 14:34:48.491812742 +0000 UTC m=+0.084507174 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:34:49 compute-0 nova_compute[187118]: 2025-11-24 14:34:49.853 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:52 compute-0 nova_compute[187118]: 2025-11-24 14:34:52.515 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:54 compute-0 podman[216500]: 2025-11-24 14:34:54.48345418 +0000 UTC m=+0.074255685 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, vcs-type=git, architecture=x86_64)
Nov 24 14:34:54 compute-0 podman[216499]: 2025-11-24 14:34:54.518125815 +0000 UTC m=+0.109212358 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 24 14:34:54 compute-0 nova_compute[187118]: 2025-11-24 14:34:54.855 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:56.661 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:34:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:56.662 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:34:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:34:56.663 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:34:57 compute-0 nova_compute[187118]: 2025-11-24 14:34:57.516 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:34:59 compute-0 nova_compute[187118]: 2025-11-24 14:34:59.858 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:00 compute-0 podman[216546]: 2025-11-24 14:35:00.492305098 +0000 UTC m=+0.090467226 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:35:02 compute-0 nova_compute[187118]: 2025-11-24 14:35:02.518 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:04 compute-0 nova_compute[187118]: 2025-11-24 14:35:04.861 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.332 187122 DEBUG oslo_concurrency.lockutils [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9b3efeab-7379-4e78-8df8-032e6e66cd67" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.333 187122 DEBUG oslo_concurrency.lockutils [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.334 187122 DEBUG oslo_concurrency.lockutils [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.334 187122 DEBUG oslo_concurrency.lockutils [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.334 187122 DEBUG oslo_concurrency.lockutils [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.337 187122 INFO nova.compute.manager [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Terminating instance
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.339 187122 DEBUG nova.compute.manager [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:35:06 compute-0 kernel: tap567cb66f-ac (unregistering): left promiscuous mode
Nov 24 14:35:06 compute-0 NetworkManager[55697]: <info>  [1763994906.3674] device (tap567cb66f-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.379 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:06 compute-0 ovn_controller[95613]: 2025-11-24T14:35:06Z|00098|binding|INFO|Releasing lport 567cb66f-ac48-449e-accf-08c9a578a66c from this chassis (sb_readonly=0)
Nov 24 14:35:06 compute-0 ovn_controller[95613]: 2025-11-24T14:35:06Z|00099|binding|INFO|Setting lport 567cb66f-ac48-449e-accf-08c9a578a66c down in Southbound
Nov 24 14:35:06 compute-0 ovn_controller[95613]: 2025-11-24T14:35:06Z|00100|binding|INFO|Removing iface tap567cb66f-ac ovn-installed in OVS
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.397 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.411 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:0a:c9 10.100.0.25'], port_security=['fa:16:3e:85:0a:c9 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '9b3efeab-7379-4e78-8df8-032e6e66cd67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': '03bfcc80-c0c3-4e20-9635-e8e21409c08e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cac3ec3-3f65-41d6-96cd-4c08dbf282d0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=567cb66f-ac48-449e-accf-08c9a578a66c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.414 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 567cb66f-ac48-449e-accf-08c9a578a66c in datapath ab0b30f6-b57a-4fe9-b7c2-d307773590ec unbound from our chassis
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.417 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ab0b30f6-b57a-4fe9-b7c2-d307773590ec
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.419 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.438 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[cd612c8d-bf39-4c95-a88d-4eafbd69d4e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:06 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Nov 24 14:35:06 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 14.039s CPU time.
Nov 24 14:35:06 compute-0 systemd-machined[153483]: Machine qemu-7-instance-00000007 terminated.
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.477 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[26402b93-592e-48ee-a2fb-8d9361061259]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.481 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[b9fae2dd-d754-409b-90a0-ad6a58bf9e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.516 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[032cfde1-d36a-4d5f-9423-18896a9f1970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.538 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[17f0e3d2-2bf7-42d9-b09e-605bbaebee91]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapab0b30f6-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:66:db:85'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 832, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 311400, 'reachable_time': 28372, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 7, 'inoctets': 524, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 7, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 524, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 7, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216581, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:06 compute-0 kernel: tap567cb66f-ac: entered promiscuous mode
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.557 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8cdb3d4d-63b9-4a15-8a9b-8deb1799921e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapab0b30f6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 311412, 'tstamp': 311412}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216582, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapab0b30f6-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 311415, 'tstamp': 311415}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216582, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:06 compute-0 kernel: tap567cb66f-ac (unregistering): left promiscuous mode
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.560 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab0b30f6-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:06 compute-0 NetworkManager[55697]: <info>  [1763994906.5626] manager: (tap567cb66f-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.562 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.569 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.581 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.582 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab0b30f6-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.582 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.583 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapab0b30f6-b0, col_values=(('external_ids', {'iface-id': '2a7bcd0d-ee04-4834-b94b-d2234a1d3740'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:06 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:06.584 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.605 187122 DEBUG nova.compute.manager [req-dcb24784-d64c-4dad-b384-b49038b93710 req-27c96417-1da5-44b6-a0c6-ef3967d53f03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Received event network-vif-unplugged-567cb66f-ac48-449e-accf-08c9a578a66c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.606 187122 DEBUG oslo_concurrency.lockutils [req-dcb24784-d64c-4dad-b384-b49038b93710 req-27c96417-1da5-44b6-a0c6-ef3967d53f03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.606 187122 DEBUG oslo_concurrency.lockutils [req-dcb24784-d64c-4dad-b384-b49038b93710 req-27c96417-1da5-44b6-a0c6-ef3967d53f03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.606 187122 DEBUG oslo_concurrency.lockutils [req-dcb24784-d64c-4dad-b384-b49038b93710 req-27c96417-1da5-44b6-a0c6-ef3967d53f03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.606 187122 DEBUG nova.compute.manager [req-dcb24784-d64c-4dad-b384-b49038b93710 req-27c96417-1da5-44b6-a0c6-ef3967d53f03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] No waiting events found dispatching network-vif-unplugged-567cb66f-ac48-449e-accf-08c9a578a66c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.606 187122 DEBUG nova.compute.manager [req-dcb24784-d64c-4dad-b384-b49038b93710 req-27c96417-1da5-44b6-a0c6-ef3967d53f03 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Received event network-vif-unplugged-567cb66f-ac48-449e-accf-08c9a578a66c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.611 187122 INFO nova.virt.libvirt.driver [-] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Instance destroyed successfully.
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.611 187122 DEBUG nova.objects.instance [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid 9b3efeab-7379-4e78-8df8-032e6e66cd67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.621 187122 DEBUG nova.virt.libvirt.vif [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:34:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-341495495',display_name='tempest-TestNetworkBasicOps-server-341495495',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-341495495',id=7,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOZkxL5wi9Q/u/qvNDr2KmRfqGzm1BMXjeAzoCz5U2QtADvT2sbNAlUmFRNErLKp6Wu4BBlqdVbN2TTC5tMIEqal5FJUQiOsx1yeNac03Gqj1PJHUBfugrNfq2yRaju5gA==',key_name='tempest-TestNetworkBasicOps-471487238',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:34:18Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-3h48y00q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:34:18Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=9b3efeab-7379-4e78-8df8-032e6e66cd67,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.621 187122 DEBUG nova.network.os_vif_util [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "567cb66f-ac48-449e-accf-08c9a578a66c", "address": "fa:16:3e:85:0a:c9", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap567cb66f-ac", "ovs_interfaceid": "567cb66f-ac48-449e-accf-08c9a578a66c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.622 187122 DEBUG nova.network.os_vif_util [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=567cb66f-ac48-449e-accf-08c9a578a66c,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap567cb66f-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.622 187122 DEBUG os_vif [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=567cb66f-ac48-449e-accf-08c9a578a66c,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap567cb66f-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.624 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.624 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap567cb66f-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.625 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.627 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.630 187122 INFO os_vif [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=567cb66f-ac48-449e-accf-08c9a578a66c,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap567cb66f-ac')
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.630 187122 INFO nova.virt.libvirt.driver [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Deleting instance files /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67_del
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.631 187122 INFO nova.virt.libvirt.driver [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Deletion of /var/lib/nova/instances/9b3efeab-7379-4e78-8df8-032e6e66cd67_del complete
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.669 187122 INFO nova.compute.manager [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.670 187122 DEBUG oslo.service.loopingcall [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.670 187122 DEBUG nova.compute.manager [-] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:35:06 compute-0 nova_compute[187118]: 2025-11-24 14:35:06.671 187122 DEBUG nova.network.neutron [-] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:35:07 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:07.731 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:35:07 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:07.733 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.734 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.750 187122 DEBUG nova.network.neutron [-] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.767 187122 INFO nova.compute.manager [-] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Took 1.10 seconds to deallocate network for instance.
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.816 187122 DEBUG oslo_concurrency.lockutils [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.816 187122 DEBUG oslo_concurrency.lockutils [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.832 187122 DEBUG nova.compute.manager [req-2cc8a7a9-eedb-4641-af05-dd61f65d3fd1 req-821b4f72-7f56-4fb3-80a5-a531fa92032d 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Received event network-vif-deleted-567cb66f-ac48-449e-accf-08c9a578a66c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.888 187122 DEBUG nova.compute.provider_tree [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.902 187122 DEBUG nova.scheduler.client.report [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.923 187122 DEBUG oslo_concurrency.lockutils [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:07 compute-0 nova_compute[187118]: 2025-11-24 14:35:07.949 187122 INFO nova.scheduler.client.report [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance 9b3efeab-7379-4e78-8df8-032e6e66cd67
Nov 24 14:35:08 compute-0 nova_compute[187118]: 2025-11-24 14:35:08.003 187122 DEBUG oslo_concurrency.lockutils [None req-5ec2e3ea-b4e6-48e3-8fbd-40727b6fee61 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:08 compute-0 nova_compute[187118]: 2025-11-24 14:35:08.703 187122 DEBUG nova.compute.manager [req-9d8df766-bbdf-468d-803b-0c5de22c0b61 req-3e768e5b-cd13-4441-8acb-a93050f22b2f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Received event network-vif-plugged-567cb66f-ac48-449e-accf-08c9a578a66c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:08 compute-0 nova_compute[187118]: 2025-11-24 14:35:08.703 187122 DEBUG oslo_concurrency.lockutils [req-9d8df766-bbdf-468d-803b-0c5de22c0b61 req-3e768e5b-cd13-4441-8acb-a93050f22b2f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:08 compute-0 nova_compute[187118]: 2025-11-24 14:35:08.704 187122 DEBUG oslo_concurrency.lockutils [req-9d8df766-bbdf-468d-803b-0c5de22c0b61 req-3e768e5b-cd13-4441-8acb-a93050f22b2f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:08 compute-0 nova_compute[187118]: 2025-11-24 14:35:08.704 187122 DEBUG oslo_concurrency.lockutils [req-9d8df766-bbdf-468d-803b-0c5de22c0b61 req-3e768e5b-cd13-4441-8acb-a93050f22b2f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9b3efeab-7379-4e78-8df8-032e6e66cd67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:08 compute-0 nova_compute[187118]: 2025-11-24 14:35:08.704 187122 DEBUG nova.compute.manager [req-9d8df766-bbdf-468d-803b-0c5de22c0b61 req-3e768e5b-cd13-4441-8acb-a93050f22b2f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] No waiting events found dispatching network-vif-plugged-567cb66f-ac48-449e-accf-08c9a578a66c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:35:08 compute-0 nova_compute[187118]: 2025-11-24 14:35:08.704 187122 WARNING nova.compute.manager [req-9d8df766-bbdf-468d-803b-0c5de22c0b61 req-3e768e5b-cd13-4441-8acb-a93050f22b2f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Received unexpected event network-vif-plugged-567cb66f-ac48-449e-accf-08c9a578a66c for instance with vm_state deleted and task_state None.
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.456 187122 DEBUG oslo_concurrency.lockutils [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "interface-70f125d3-772c-4512-89cd-87864bebf8cc-fe1e3b21-532c-47fd-89c8-481678f2454b" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.456 187122 DEBUG oslo_concurrency.lockutils [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "interface-70f125d3-772c-4512-89cd-87864bebf8cc-fe1e3b21-532c-47fd-89c8-481678f2454b" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.466 187122 DEBUG nova.objects.instance [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'flavor' on Instance uuid 70f125d3-772c-4512-89cd-87864bebf8cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.485 187122 DEBUG nova.virt.libvirt.vif [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:33:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-814470289',display_name='tempest-TestNetworkBasicOps-server-814470289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-814470289',id=6,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNxVUOsd/7KtYLgQBtlHKzBeWF9UhFxiZgEb7YLnyBIIN1OVKJ0gJRpD8NWMGNkw7u8jH0JIXAWvXNBkBhNhVRmW7IlL2b/guGzfz0SVJ7p7J0ywko8iMgOfh8p0fPQCuw==',key_name='tempest-TestNetworkBasicOps-1052892258',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:33:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-00ajpf73',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:33:39Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=70f125d3-772c-4512-89cd-87864bebf8cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.486 187122 DEBUG nova.network.os_vif_util [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.486 187122 DEBUG nova.network.os_vif_util [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.489 187122 DEBUG nova.virt.libvirt.guest [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.492 187122 DEBUG nova.virt.libvirt.guest [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.494 187122 DEBUG nova.virt.libvirt.driver [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Attempting to detach device tapfe1e3b21-53 from instance 70f125d3-772c-4512-89cd-87864bebf8cc from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.494 187122 DEBUG nova.virt.libvirt.guest [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] detach device xml: <interface type="ethernet">
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <mac address="fa:16:3e:f5:6e:76"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <model type="virtio"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <mtu size="1442"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <target dev="tapfe1e3b21-53"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]: </interface>
Nov 24 14:35:09 compute-0 nova_compute[187118]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.502 187122 DEBUG nova.virt.libvirt.guest [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.506 187122 DEBUG nova.virt.libvirt.guest [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <name>instance-00000006</name>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <uuid>70f125d3-772c-4512-89cd-87864bebf8cc</uuid>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-814470289</nova:name>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:34:06</nova:creationTime>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:port uuid="80657a89-07d8-4355-a80e-f13874579df8">
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:port uuid="fe1e3b21-532c-47fd-89c8-481678f2454b">
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:35:09 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <memory unit='KiB'>131072</memory>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <vcpu placement='static'>1</vcpu>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <resource>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <partition>/machine</partition>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </resource>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <sysinfo type='smbios'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <system>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='manufacturer'>RDO</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='serial'>70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='uuid'>70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='family'>Virtual Machine</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </system>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <os>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <boot dev='hd'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <smbios mode='sysinfo'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </os>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <features>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <vmcoreinfo state='on'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </features>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <vendor>AMD</vendor>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='x2apic'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='hypervisor'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='stibp'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='ssbd'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='overflow-recov'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='succor'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='ibrs'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='lbrv'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='pause-filter'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='xsaves'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='svm'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='topoext'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='npt'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='nrip-save'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <clock offset='utc'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <timer name='hpet' present='no'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <on_poweroff>destroy</on_poweroff>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <on_reboot>restart</on_reboot>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <on_crash>destroy</on_crash>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <disk type='file' device='disk'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk' index='2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <backingStore type='file' index='3'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:         <format type='raw'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:         <source file='/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:         <backingStore/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       </backingStore>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target dev='vda' bus='virtio'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='virtio-disk0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <disk type='file' device='cdrom'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.config' index='1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <backingStore/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target dev='sda' bus='sata'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <readonly/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='sata0-0-0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pcie.0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='1' port='0x10'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='2' port='0x11'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='3' port='0x12'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.3'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='4' port='0x13'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.4'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='5' port='0x14'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.5'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='6' port='0x15'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.6'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='7' port='0x16'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.7'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='8' port='0x17'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.8'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='9' port='0x18'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.9'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='10' port='0x19'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.10'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='11' port='0x1a'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.11'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='12' port='0x1b'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.12'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='13' port='0x1c'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.13'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='14' port='0x1d'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.14'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='15' port='0x1e'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.15'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='16' port='0x1f'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.16'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='17' port='0x20'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.17'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='18' port='0x21'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.18'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='19' port='0x22'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.19'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='20' port='0x23'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.20'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='21' port='0x24'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.21'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='22' port='0x25'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.22'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='23' port='0x26'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.23'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='24' port='0x27'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.24'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='25' port='0x28'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.25'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-pci-bridge'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.26'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='usb'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='sata' index='0'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='ide'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <interface type='ethernet'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <mac address='fa:16:3e:74:5b:4f'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target dev='tap80657a89-07'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model type='virtio'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <mtu size='1442'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='net0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <interface type='ethernet'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <mac address='fa:16:3e:f5:6e:76'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target dev='tapfe1e3b21-53'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model type='virtio'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <mtu size='1442'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='net1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <serial type='pty'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log' append='off'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target type='isa-serial' port='0'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:         <model name='isa-serial'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       </target>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log' append='off'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target type='serial' port='0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </console>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <input type='tablet' bus='usb'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='input0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='usb' bus='0' port='1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <input type='mouse' bus='ps2'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='input1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <input type='keyboard' bus='ps2'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='input2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <listen type='address' address='::0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <audio id='1' type='none'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <video>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='video0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </video>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <watchdog model='itco' action='reset'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='watchdog0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </watchdog>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <memballoon model='virtio'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <stats period='10'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='balloon0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <rng model='virtio'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <backend model='random'>/dev/urandom</backend>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='rng0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <label>system_u:system_r:svirt_t:s0:c678,c836</label>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c678,c836</imagelabel>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <label>+107:+107</label>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <imagelabel>+107:+107</imagelabel>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:35:09 compute-0 nova_compute[187118]: </domain>
Nov 24 14:35:09 compute-0 nova_compute[187118]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.506 187122 INFO nova.virt.libvirt.driver [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully detached device tapfe1e3b21-53 from instance 70f125d3-772c-4512-89cd-87864bebf8cc from the persistent domain config.
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.506 187122 DEBUG nova.virt.libvirt.driver [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] (1/8): Attempting to detach device tapfe1e3b21-53 with device alias net1 from instance 70f125d3-772c-4512-89cd-87864bebf8cc from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.506 187122 DEBUG nova.virt.libvirt.guest [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] detach device xml: <interface type="ethernet">
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <mac address="fa:16:3e:f5:6e:76"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <model type="virtio"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <mtu size="1442"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <target dev="tapfe1e3b21-53"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]: </interface>
Nov 24 14:35:09 compute-0 nova_compute[187118]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Nov 24 14:35:09 compute-0 kernel: tapfe1e3b21-53 (unregistering): left promiscuous mode
Nov 24 14:35:09 compute-0 NetworkManager[55697]: <info>  [1763994909.6267] device (tapfe1e3b21-53): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.630 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:09 compute-0 ovn_controller[95613]: 2025-11-24T14:35:09Z|00101|binding|INFO|Releasing lport fe1e3b21-532c-47fd-89c8-481678f2454b from this chassis (sb_readonly=0)
Nov 24 14:35:09 compute-0 ovn_controller[95613]: 2025-11-24T14:35:09Z|00102|binding|INFO|Setting lport fe1e3b21-532c-47fd-89c8-481678f2454b down in Southbound
Nov 24 14:35:09 compute-0 ovn_controller[95613]: 2025-11-24T14:35:09Z|00103|binding|INFO|Removing iface tapfe1e3b21-53 ovn-installed in OVS
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.632 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.633 187122 DEBUG nova.virt.libvirt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Received event <DeviceRemovedEvent: 1763994909.6337187, 70f125d3-772c-4512-89cd-87864bebf8cc => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.636 187122 DEBUG nova.virt.libvirt.driver [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Start waiting for the detach event from libvirt for device tapfe1e3b21-53 with device alias net1 for instance 70f125d3-772c-4512-89cd-87864bebf8cc _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.637 187122 DEBUG nova.virt.libvirt.guest [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.638 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f5:6e:76 10.100.0.21', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.21/28', 'neutron:device_id': '70f125d3-772c-4512-89cd-87864bebf8cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2cac3ec3-3f65-41d6-96cd-4c08dbf282d0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=fe1e3b21-532c-47fd-89c8-481678f2454b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.639 187122 DEBUG nova.virt.libvirt.guest [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <name>instance-00000006</name>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <uuid>70f125d3-772c-4512-89cd-87864bebf8cc</uuid>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-814470289</nova:name>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:34:06</nova:creationTime>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:port uuid="80657a89-07d8-4355-a80e-f13874579df8">
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:port uuid="fe1e3b21-532c-47fd-89c8-481678f2454b">
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.21" ipVersion="4"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:35:09 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <memory unit='KiB'>131072</memory>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <vcpu placement='static'>1</vcpu>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <resource>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <partition>/machine</partition>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </resource>
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.640 104469 INFO neutron.agent.ovn.metadata.agent [-] Port fe1e3b21-532c-47fd-89c8-481678f2454b in datapath ab0b30f6-b57a-4fe9-b7c2-d307773590ec unbound from our chassis
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <sysinfo type='smbios'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <system>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='manufacturer'>RDO</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='serial'>70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='uuid'>70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <entry name='family'>Virtual Machine</entry>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </system>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <os>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <boot dev='hd'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <smbios mode='sysinfo'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </os>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <features>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <vmcoreinfo state='on'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </features>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <vendor>AMD</vendor>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='x2apic'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='hypervisor'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='stibp'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='ssbd'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='overflow-recov'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='succor'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='ibrs'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='lbrv'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='pause-filter'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='xsaves'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='svm'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='require' name='topoext'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='npt'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <feature policy='disable' name='nrip-save'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <clock offset='utc'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <timer name='hpet' present='no'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <on_poweroff>destroy</on_poweroff>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <on_reboot>restart</on_reboot>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <on_crash>destroy</on_crash>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <disk type='file' device='disk'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk' index='2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <backingStore type='file' index='3'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:         <format type='raw'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:         <source file='/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:         <backingStore/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       </backingStore>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target dev='vda' bus='virtio'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='virtio-disk0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <disk type='file' device='cdrom'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.config' index='1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <backingStore/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target dev='sda' bus='sata'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <readonly/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='sata0-0-0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pcie.0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='1' port='0x10'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='2' port='0x11'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='3' port='0x12'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.3'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='4' port='0x13'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.4'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='5' port='0x14'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.5'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='6' port='0x15'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.6'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='7' port='0x16'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.7'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='8' port='0x17'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.8'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='9' port='0x18'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.9'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='10' port='0x19'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.10'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='11' port='0x1a'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.11'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='12' port='0x1b'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.12'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='13' port='0x1c'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.13'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='14' port='0x1d'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.14'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='15' port='0x1e'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.15'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='16' port='0x1f'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.16'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='17' port='0x20'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.17'/>
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.642 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ab0b30f6-b57a-4fe9-b7c2-d307773590ec, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='18' port='0x21'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.18'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='19' port='0x22'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.19'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='20' port='0x23'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.20'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='21' port='0x24'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.21'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='22' port='0x25'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.22'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='23' port='0x26'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.23'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='24' port='0x27'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.24'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target chassis='25' port='0x28'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.25'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model name='pcie-pci-bridge'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='pci.26'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='usb'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <controller type='sata' index='0'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='ide'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <interface type='ethernet'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <mac address='fa:16:3e:74:5b:4f'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target dev='tap80657a89-07'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model type='virtio'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <mtu size='1442'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='net0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <serial type='pty'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log' append='off'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target type='isa-serial' port='0'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:         <model name='isa-serial'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       </target>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log' append='off'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <target type='serial' port='0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </console>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <input type='tablet' bus='usb'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='input0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='usb' bus='0' port='1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <input type='mouse' bus='ps2'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='input1'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <input type='keyboard' bus='ps2'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='input2'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <listen type='address' address='::0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <audio id='1' type='none'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <video>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='video0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </video>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <watchdog model='itco' action='reset'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='watchdog0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </watchdog>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <memballoon model='virtio'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <stats period='10'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='balloon0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <rng model='virtio'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <backend model='random'>/dev/urandom</backend>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <alias name='rng0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <label>system_u:system_r:svirt_t:s0:c678,c836</label>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c678,c836</imagelabel>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <label>+107:+107</label>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <imagelabel>+107:+107</imagelabel>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:35:09 compute-0 nova_compute[187118]: </domain>
Nov 24 14:35:09 compute-0 nova_compute[187118]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.640 187122 INFO nova.virt.libvirt.driver [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully detached device tapfe1e3b21-53 from instance 70f125d3-772c-4512-89cd-87864bebf8cc from the live domain config.
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.640 187122 DEBUG nova.virt.libvirt.vif [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:33:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-814470289',display_name='tempest-TestNetworkBasicOps-server-814470289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-814470289',id=6,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNxVUOsd/7KtYLgQBtlHKzBeWF9UhFxiZgEb7YLnyBIIN1OVKJ0gJRpD8NWMGNkw7u8jH0JIXAWvXNBkBhNhVRmW7IlL2b/guGzfz0SVJ7p7J0ywko8iMgOfh8p0fPQCuw==',key_name='tempest-TestNetworkBasicOps-1052892258',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:33:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-00ajpf73',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:33:39Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=70f125d3-772c-4512-89cd-87864bebf8cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.641 187122 DEBUG nova.network.os_vif_util [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.641 187122 DEBUG nova.network.os_vif_util [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.641 187122 DEBUG os_vif [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.643 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.643 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe1e3b21-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.643 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[110e03c0-db15-4ca5-8854-ecdf88535650]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.644 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec namespace which is not needed anymore
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.646 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.648 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.659 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.666 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.668 187122 INFO os_vif [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53')
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.669 187122 DEBUG nova.virt.libvirt.guest [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-814470289</nova:name>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:35:09</nova:creationTime>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     <nova:port uuid="80657a89-07d8-4355-a80e-f13874579df8">
Nov 24 14:35:09 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 14:35:09 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:35:09 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:35:09 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:35:09 compute-0 nova_compute[187118]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 14:35:09 compute-0 neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec[216167]: [NOTICE]   (216171) : haproxy version is 2.8.14-c23fe91
Nov 24 14:35:09 compute-0 neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec[216167]: [NOTICE]   (216171) : path to executable is /usr/sbin/haproxy
Nov 24 14:35:09 compute-0 neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec[216167]: [WARNING]  (216171) : Exiting Master process...
Nov 24 14:35:09 compute-0 neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec[216167]: [WARNING]  (216171) : Exiting Master process...
Nov 24 14:35:09 compute-0 neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec[216167]: [ALERT]    (216171) : Current worker (216173) exited with code 143 (Terminated)
Nov 24 14:35:09 compute-0 neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec[216167]: [WARNING]  (216171) : All workers exited. Exiting... (0)
Nov 24 14:35:09 compute-0 systemd[1]: libpod-f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a.scope: Deactivated successfully.
Nov 24 14:35:09 compute-0 podman[216623]: 2025-11-24 14:35:09.788185681 +0000 UTC m=+0.051448283 container died f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118)
Nov 24 14:35:09 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a-userdata-shm.mount: Deactivated successfully.
Nov 24 14:35:09 compute-0 systemd[1]: var-lib-containers-storage-overlay-3e657a38b0bbc3b680db9fa820f06b3bc61f5c28c379c1688570d47e734ae638-merged.mount: Deactivated successfully.
Nov 24 14:35:09 compute-0 podman[216623]: 2025-11-24 14:35:09.840678633 +0000 UTC m=+0.103941225 container cleanup f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:35:09 compute-0 systemd[1]: libpod-conmon-f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a.scope: Deactivated successfully.
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.864 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:09 compute-0 podman[216654]: 2025-11-24 14:35:09.909397335 +0000 UTC m=+0.046328663 container remove f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.914 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[4e346dd3-193d-43d8-99b2-c39077a6c631]: (4, ('Mon Nov 24 02:35:09 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec (f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a)\nf4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a\nMon Nov 24 02:35:09 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec (f4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a)\nf4120d8d64555d156ceab409f26fa19cbee86c427425f86993f0c3b3a9921c1a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.916 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f2099827-c1c9-4140-89ce-39a321dee3bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.917 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab0b30f6-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.919 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:09 compute-0 kernel: tapab0b30f6-b0: left promiscuous mode
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.924 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8afe8b98-aafb-45f8-8169-560f7c3ef699]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:09 compute-0 nova_compute[187118]: 2025-11-24 14:35:09.939 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.945 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ce25a8dd-0a4f-4970-9fcd-c59666f6e32a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.946 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[64245f80-ad03-41f5-a40c-fc753388268a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.967 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e52707fe-d746-43ea-a559-0a1cd8bde3cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 311395, 'reachable_time': 19116, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216671, 'error': None, 'target': 'ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.969 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ab0b30f6-b57a-4fe9-b7c2-d307773590ec deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:35:09 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:09.969 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[70d375a6-8d01-4739-9438-d9878a40ab5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:09 compute-0 systemd[1]: run-netns-ovnmeta\x2dab0b30f6\x2db57a\x2d4fe9\x2db7c2\x2dd307773590ec.mount: Deactivated successfully.
Nov 24 14:35:10 compute-0 podman[216670]: 2025-11-24 14:35:10.066051305 +0000 UTC m=+0.082973272 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 14:35:10 compute-0 nova_compute[187118]: 2025-11-24 14:35:10.406 187122 DEBUG nova.compute.manager [req-0670fea1-1cc6-49bb-b014-a5c36b9a6f66 req-f39628eb-f9de-4deb-a01b-122a0d079196 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-unplugged-fe1e3b21-532c-47fd-89c8-481678f2454b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:10 compute-0 nova_compute[187118]: 2025-11-24 14:35:10.406 187122 DEBUG oslo_concurrency.lockutils [req-0670fea1-1cc6-49bb-b014-a5c36b9a6f66 req-f39628eb-f9de-4deb-a01b-122a0d079196 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:10 compute-0 nova_compute[187118]: 2025-11-24 14:35:10.407 187122 DEBUG oslo_concurrency.lockutils [req-0670fea1-1cc6-49bb-b014-a5c36b9a6f66 req-f39628eb-f9de-4deb-a01b-122a0d079196 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:10 compute-0 nova_compute[187118]: 2025-11-24 14:35:10.407 187122 DEBUG oslo_concurrency.lockutils [req-0670fea1-1cc6-49bb-b014-a5c36b9a6f66 req-f39628eb-f9de-4deb-a01b-122a0d079196 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:10 compute-0 nova_compute[187118]: 2025-11-24 14:35:10.407 187122 DEBUG nova.compute.manager [req-0670fea1-1cc6-49bb-b014-a5c36b9a6f66 req-f39628eb-f9de-4deb-a01b-122a0d079196 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] No waiting events found dispatching network-vif-unplugged-fe1e3b21-532c-47fd-89c8-481678f2454b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:35:10 compute-0 nova_compute[187118]: 2025-11-24 14:35:10.407 187122 WARNING nova.compute.manager [req-0670fea1-1cc6-49bb-b014-a5c36b9a6f66 req-f39628eb-f9de-4deb-a01b-122a0d079196 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received unexpected event network-vif-unplugged-fe1e3b21-532c-47fd-89c8-481678f2454b for instance with vm_state active and task_state None.
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.359 187122 DEBUG oslo_concurrency.lockutils [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.359 187122 DEBUG oslo_concurrency.lockutils [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.360 187122 DEBUG nova.network.neutron [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.450 187122 DEBUG nova.compute.manager [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-deleted-fe1e3b21-532c-47fd-89c8-481678f2454b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.450 187122 INFO nova.compute.manager [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Neutron deleted interface fe1e3b21-532c-47fd-89c8-481678f2454b; detaching it from the instance and deleting it from the info cache
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.450 187122 DEBUG nova.network.neutron [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.470 187122 DEBUG nova.objects.instance [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lazy-loading 'system_metadata' on Instance uuid 70f125d3-772c-4512-89cd-87864bebf8cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.511 187122 DEBUG nova.objects.instance [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lazy-loading 'flavor' on Instance uuid 70f125d3-772c-4512-89cd-87864bebf8cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.550 187122 DEBUG nova.virt.libvirt.vif [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:33:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-814470289',display_name='tempest-TestNetworkBasicOps-server-814470289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-814470289',id=6,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNxVUOsd/7KtYLgQBtlHKzBeWF9UhFxiZgEb7YLnyBIIN1OVKJ0gJRpD8NWMGNkw7u8jH0JIXAWvXNBkBhNhVRmW7IlL2b/guGzfz0SVJ7p7J0ywko8iMgOfh8p0fPQCuw==',key_name='tempest-TestNetworkBasicOps-1052892258',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:33:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-00ajpf73',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:33:39Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=70f125d3-772c-4512-89cd-87864bebf8cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.551 187122 DEBUG nova.network.os_vif_util [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Converting VIF {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.553 187122 DEBUG nova.network.os_vif_util [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.558 187122 DEBUG nova.virt.libvirt.guest [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.564 187122 DEBUG nova.virt.libvirt.guest [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <name>instance-00000006</name>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <uuid>70f125d3-772c-4512-89cd-87864bebf8cc</uuid>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-814470289</nova:name>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:35:09</nova:creationTime>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:port uuid="80657a89-07d8-4355-a80e-f13874579df8">
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:35:11 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <memory unit='KiB'>131072</memory>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <vcpu placement='static'>1</vcpu>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <resource>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <partition>/machine</partition>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </resource>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <sysinfo type='smbios'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <system>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='manufacturer'>RDO</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='serial'>70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='uuid'>70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='family'>Virtual Machine</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </system>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <os>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <boot dev='hd'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <smbios mode='sysinfo'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </os>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <features>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <vmcoreinfo state='on'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </features>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <vendor>AMD</vendor>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='x2apic'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='hypervisor'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='stibp'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='ssbd'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='overflow-recov'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='succor'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='ibrs'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='lbrv'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='pause-filter'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='xsaves'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='svm'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='topoext'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='npt'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='nrip-save'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <clock offset='utc'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <timer name='hpet' present='no'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <on_poweroff>destroy</on_poweroff>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <on_reboot>restart</on_reboot>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <on_crash>destroy</on_crash>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <disk type='file' device='disk'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk' index='2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <backingStore type='file' index='3'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:         <format type='raw'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:         <source file='/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:         <backingStore/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       </backingStore>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target dev='vda' bus='virtio'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='virtio-disk0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <disk type='file' device='cdrom'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.config' index='1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <backingStore/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target dev='sda' bus='sata'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <readonly/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='sata0-0-0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pcie.0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='1' port='0x10'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='2' port='0x11'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='3' port='0x12'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.3'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='4' port='0x13'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.4'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='5' port='0x14'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.5'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='6' port='0x15'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.6'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='7' port='0x16'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.7'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='8' port='0x17'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.8'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='9' port='0x18'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.9'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='10' port='0x19'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.10'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='11' port='0x1a'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.11'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='12' port='0x1b'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.12'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='13' port='0x1c'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.13'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='14' port='0x1d'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.14'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='15' port='0x1e'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.15'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='16' port='0x1f'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.16'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='17' port='0x20'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.17'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='18' port='0x21'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.18'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='19' port='0x22'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.19'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='20' port='0x23'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.20'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='21' port='0x24'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.21'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='22' port='0x25'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.22'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='23' port='0x26'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.23'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='24' port='0x27'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.24'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='25' port='0x28'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.25'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-pci-bridge'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.26'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='usb'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='sata' index='0'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='ide'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <interface type='ethernet'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <mac address='fa:16:3e:74:5b:4f'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target dev='tap80657a89-07'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model type='virtio'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <mtu size='1442'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='net0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <serial type='pty'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log' append='off'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target type='isa-serial' port='0'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:         <model name='isa-serial'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       </target>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log' append='off'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target type='serial' port='0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </console>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <input type='tablet' bus='usb'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='input0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='usb' bus='0' port='1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <input type='mouse' bus='ps2'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='input1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <input type='keyboard' bus='ps2'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='input2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <listen type='address' address='::0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <audio id='1' type='none'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <video>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='video0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </video>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <watchdog model='itco' action='reset'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='watchdog0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </watchdog>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <memballoon model='virtio'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <stats period='10'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='balloon0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <rng model='virtio'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <backend model='random'>/dev/urandom</backend>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='rng0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <label>system_u:system_r:svirt_t:s0:c678,c836</label>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c678,c836</imagelabel>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <label>+107:+107</label>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <imagelabel>+107:+107</imagelabel>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:35:11 compute-0 nova_compute[187118]: </domain>
Nov 24 14:35:11 compute-0 nova_compute[187118]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.565 187122 DEBUG nova.virt.libvirt.guest [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.570 187122 DEBUG nova.virt.libvirt.guest [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:f5:6e:76"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapfe1e3b21-53"/></interface>not found in domain: <domain type='kvm' id='6'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <name>instance-00000006</name>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <uuid>70f125d3-772c-4512-89cd-87864bebf8cc</uuid>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-814470289</nova:name>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:35:09</nova:creationTime>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:port uuid="80657a89-07d8-4355-a80e-f13874579df8">
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:35:11 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <memory unit='KiB'>131072</memory>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <currentMemory unit='KiB'>131072</currentMemory>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <vcpu placement='static'>1</vcpu>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <resource>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <partition>/machine</partition>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </resource>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <sysinfo type='smbios'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <system>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='manufacturer'>RDO</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='product'>OpenStack Compute</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='serial'>70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='uuid'>70f125d3-772c-4512-89cd-87864bebf8cc</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <entry name='family'>Virtual Machine</entry>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </system>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <os>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <boot dev='hd'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <smbios mode='sysinfo'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </os>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <features>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <vmcoreinfo state='on'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </features>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <cpu mode='custom' match='exact' check='full'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <model fallback='forbid'>EPYC-Rome</model>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <vendor>AMD</vendor>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='x2apic'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc-deadline'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='hypervisor'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='tsc_adjust'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='spec-ctrl'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='stibp'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='ssbd'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='cmp_legacy'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='overflow-recov'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='succor'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='ibrs'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='amd-ssbd'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='virt-ssbd'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='lbrv'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='tsc-scale'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='vmcb-clean'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='flushbyasid'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='pause-filter'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='pfthreshold'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='svme-addr-chk'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='lfence-always-serializing'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='xsaves'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='svm'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='require' name='topoext'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='npt'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <feature policy='disable' name='nrip-save'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <clock offset='utc'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <timer name='pit' tickpolicy='delay'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <timer name='rtc' tickpolicy='catchup'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <timer name='hpet' present='no'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <on_poweroff>destroy</on_poweroff>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <on_reboot>restart</on_reboot>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <on_crash>destroy</on_crash>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <disk type='file' device='disk'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <driver name='qemu' type='qcow2' cache='none'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk' index='2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <backingStore type='file' index='3'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:         <format type='raw'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:         <source file='/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:         <backingStore/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       </backingStore>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target dev='vda' bus='virtio'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='virtio-disk0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <disk type='file' device='cdrom'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <driver name='qemu' type='raw' cache='none'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <source file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/disk.config' index='1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <backingStore/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target dev='sda' bus='sata'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <readonly/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='sata0-0-0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='0' model='pcie-root'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pcie.0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='1' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='1' port='0x10'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='2' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='2' port='0x11'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='3' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='3' port='0x12'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.3'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='4' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='4' port='0x13'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.4'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='5' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='5' port='0x14'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.5'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='6' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='6' port='0x15'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.6'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='7' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='7' port='0x16'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.7'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='8' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='8' port='0x17'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.8'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='9' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='9' port='0x18'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.9'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='10' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='10' port='0x19'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.10'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='11' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='11' port='0x1a'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.11'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='12' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='12' port='0x1b'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.12'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='13' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='13' port='0x1c'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.13'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='14' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='14' port='0x1d'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.14'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='15' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='15' port='0x1e'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.15'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='16' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='16' port='0x1f'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.16'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='17' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='17' port='0x20'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.17'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='18' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='18' port='0x21'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.18'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='19' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='19' port='0x22'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.19'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='20' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='20' port='0x23'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.20'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='21' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='21' port='0x24'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.21'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='22' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='22' port='0x25'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.22'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='23' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='23' port='0x26'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.23'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='24' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='24' port='0x27'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.24'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='25' model='pcie-root-port'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-root-port'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target chassis='25' port='0x28'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.25'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model name='pcie-pci-bridge'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='pci.26'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='usb' index='0' model='piix3-uhci'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='usb'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <controller type='sata' index='0'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='ide'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </controller>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <interface type='ethernet'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <mac address='fa:16:3e:74:5b:4f'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target dev='tap80657a89-07'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model type='virtio'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <driver name='vhost' rx_queue_size='512'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <mtu size='1442'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='net0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <serial type='pty'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log' append='off'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target type='isa-serial' port='0'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:         <model name='isa-serial'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       </target>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <console type='pty' tty='/dev/pts/0'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <source path='/dev/pts/0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <log file='/var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc/console.log' append='off'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <target type='serial' port='0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='serial0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </console>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <input type='tablet' bus='usb'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='input0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='usb' bus='0' port='1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <input type='mouse' bus='ps2'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='input1'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <input type='keyboard' bus='ps2'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='input2'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </input>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <listen type='address' address='::0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </graphics>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <audio id='1' type='none'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <video>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <model type='virtio' heads='1' primary='yes'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='video0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </video>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <watchdog model='itco' action='reset'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='watchdog0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </watchdog>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <memballoon model='virtio'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <stats period='10'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='balloon0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <rng model='virtio'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <backend model='random'>/dev/urandom</backend>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <alias name='rng0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <label>system_u:system_r:svirt_t:s0:c678,c836</label>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c678,c836</imagelabel>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <label>+107:+107</label>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <imagelabel>+107:+107</imagelabel>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </seclabel>
Nov 24 14:35:11 compute-0 nova_compute[187118]: </domain>
Nov 24 14:35:11 compute-0 nova_compute[187118]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.571 187122 WARNING nova.virt.libvirt.driver [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Detaching interface fa:16:3e:f5:6e:76 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapfe1e3b21-53' not found.
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.572 187122 DEBUG nova.virt.libvirt.vif [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:33:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-814470289',display_name='tempest-TestNetworkBasicOps-server-814470289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-814470289',id=6,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNxVUOsd/7KtYLgQBtlHKzBeWF9UhFxiZgEb7YLnyBIIN1OVKJ0gJRpD8NWMGNkw7u8jH0JIXAWvXNBkBhNhVRmW7IlL2b/guGzfz0SVJ7p7J0ywko8iMgOfh8p0fPQCuw==',key_name='tempest-TestNetworkBasicOps-1052892258',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:33:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-00ajpf73',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:33:39Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=70f125d3-772c-4512-89cd-87864bebf8cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.572 187122 DEBUG nova.network.os_vif_util [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Converting VIF {"id": "fe1e3b21-532c-47fd-89c8-481678f2454b", "address": "fa:16:3e:f5:6e:76", "network": {"id": "ab0b30f6-b57a-4fe9-b7c2-d307773590ec", "bridge": "br-int", "label": "tempest-network-smoke--805264650", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.21", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe1e3b21-53", "ovs_interfaceid": "fe1e3b21-532c-47fd-89c8-481678f2454b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.573 187122 DEBUG nova.network.os_vif_util [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.574 187122 DEBUG os_vif [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.576 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.576 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe1e3b21-53, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.577 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.579 187122 INFO os_vif [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f5:6e:76,bridge_name='br-int',has_traffic_filtering=True,id=fe1e3b21-532c-47fd-89c8-481678f2454b,network=Network(ab0b30f6-b57a-4fe9-b7c2-d307773590ec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe1e3b21-53')
Nov 24 14:35:11 compute-0 nova_compute[187118]: 2025-11-24 14:35:11.580 187122 DEBUG nova.virt.libvirt.guest [req-18bea714-1715-43df-b4c6-d425e35decda req-285a9fda-fe1e-4a32-8d20-040f12cc7c44 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:name>tempest-TestNetworkBasicOps-server-814470289</nova:name>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:creationTime>2025-11-24 14:35:11</nova:creationTime>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:flavor name="m1.nano">
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:memory>128</nova:memory>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:disk>1</nova:disk>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:swap>0</nova:swap>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:vcpus>1</nova:vcpus>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </nova:flavor>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:owner>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </nova:owner>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   <nova:ports>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     <nova:port uuid="80657a89-07d8-4355-a80e-f13874579df8">
Nov 24 14:35:11 compute-0 nova_compute[187118]:       <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 14:35:11 compute-0 nova_compute[187118]:     </nova:port>
Nov 24 14:35:11 compute-0 nova_compute[187118]:   </nova:ports>
Nov 24 14:35:11 compute-0 nova_compute[187118]: </nova:instance>
Nov 24 14:35:11 compute-0 nova_compute[187118]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 24 14:35:12 compute-0 nova_compute[187118]: 2025-11-24 14:35:12.494 187122 DEBUG nova.compute.manager [req-023f36db-e9a3-4e28-9453-1a8b53af1360 req-12f700c1-26cf-4e31-85ab-b11abd81b882 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-plugged-fe1e3b21-532c-47fd-89c8-481678f2454b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:12 compute-0 nova_compute[187118]: 2025-11-24 14:35:12.494 187122 DEBUG oslo_concurrency.lockutils [req-023f36db-e9a3-4e28-9453-1a8b53af1360 req-12f700c1-26cf-4e31-85ab-b11abd81b882 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:12 compute-0 nova_compute[187118]: 2025-11-24 14:35:12.494 187122 DEBUG oslo_concurrency.lockutils [req-023f36db-e9a3-4e28-9453-1a8b53af1360 req-12f700c1-26cf-4e31-85ab-b11abd81b882 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:12 compute-0 nova_compute[187118]: 2025-11-24 14:35:12.494 187122 DEBUG oslo_concurrency.lockutils [req-023f36db-e9a3-4e28-9453-1a8b53af1360 req-12f700c1-26cf-4e31-85ab-b11abd81b882 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:12 compute-0 nova_compute[187118]: 2025-11-24 14:35:12.494 187122 DEBUG nova.compute.manager [req-023f36db-e9a3-4e28-9453-1a8b53af1360 req-12f700c1-26cf-4e31-85ab-b11abd81b882 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] No waiting events found dispatching network-vif-plugged-fe1e3b21-532c-47fd-89c8-481678f2454b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:35:12 compute-0 nova_compute[187118]: 2025-11-24 14:35:12.495 187122 WARNING nova.compute.manager [req-023f36db-e9a3-4e28-9453-1a8b53af1360 req-12f700c1-26cf-4e31-85ab-b11abd81b882 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received unexpected event network-vif-plugged-fe1e3b21-532c-47fd-89c8-481678f2454b for instance with vm_state active and task_state None.
Nov 24 14:35:12 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:12.736 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:13 compute-0 ovn_controller[95613]: 2025-11-24T14:35:13Z|00104|binding|INFO|Releasing lport f0d428f1-79c6-415d-9945-8d2b6b384323 from this chassis (sb_readonly=0)
Nov 24 14:35:13 compute-0 nova_compute[187118]: 2025-11-24 14:35:13.587 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:14 compute-0 nova_compute[187118]: 2025-11-24 14:35:14.504 187122 INFO nova.network.neutron [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Port fe1e3b21-532c-47fd-89c8-481678f2454b from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 24 14:35:14 compute-0 nova_compute[187118]: 2025-11-24 14:35:14.504 187122 DEBUG nova.network.neutron [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:35:14 compute-0 nova_compute[187118]: 2025-11-24 14:35:14.524 187122 DEBUG oslo_concurrency.lockutils [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:35:14 compute-0 nova_compute[187118]: 2025-11-24 14:35:14.554 187122 DEBUG oslo_concurrency.lockutils [None req-88cf6188-4e39-46da-b029-266972d7ebf2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "interface-70f125d3-772c-4512-89cd-87864bebf8cc-fe1e3b21-532c-47fd-89c8-481678f2454b" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:14 compute-0 nova_compute[187118]: 2025-11-24 14:35:14.647 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:14 compute-0 nova_compute[187118]: 2025-11-24 14:35:14.865 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.414 187122 DEBUG nova.compute.manager [req-75ae60eb-cca9-42fe-abbf-c3494e890e89 req-ac90b140-233b-4f07-a40c-4b66532c9390 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-changed-80657a89-07d8-4355-a80e-f13874579df8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.415 187122 DEBUG nova.compute.manager [req-75ae60eb-cca9-42fe-abbf-c3494e890e89 req-ac90b140-233b-4f07-a40c-4b66532c9390 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing instance network info cache due to event network-changed-80657a89-07d8-4355-a80e-f13874579df8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.415 187122 DEBUG oslo_concurrency.lockutils [req-75ae60eb-cca9-42fe-abbf-c3494e890e89 req-ac90b140-233b-4f07-a40c-4b66532c9390 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.415 187122 DEBUG oslo_concurrency.lockutils [req-75ae60eb-cca9-42fe-abbf-c3494e890e89 req-ac90b140-233b-4f07-a40c-4b66532c9390 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.416 187122 DEBUG nova.network.neutron [req-75ae60eb-cca9-42fe-abbf-c3494e890e89 req-ac90b140-233b-4f07-a40c-4b66532c9390 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Refreshing network info cache for port 80657a89-07d8-4355-a80e-f13874579df8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:35:15 compute-0 podman[216693]: 2025-11-24 14:35:15.463097636 +0000 UTC m=+0.064933610 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.521 187122 DEBUG oslo_concurrency.lockutils [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.522 187122 DEBUG oslo_concurrency.lockutils [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.522 187122 DEBUG oslo_concurrency.lockutils [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.522 187122 DEBUG oslo_concurrency.lockutils [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.523 187122 DEBUG oslo_concurrency.lockutils [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.525 187122 INFO nova.compute.manager [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Terminating instance
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.527 187122 DEBUG nova.compute.manager [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:35:15 compute-0 kernel: tap80657a89-07 (unregistering): left promiscuous mode
Nov 24 14:35:15 compute-0 NetworkManager[55697]: <info>  [1763994915.5538] device (tap80657a89-07): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:35:15 compute-0 ovn_controller[95613]: 2025-11-24T14:35:15Z|00105|binding|INFO|Releasing lport 80657a89-07d8-4355-a80e-f13874579df8 from this chassis (sb_readonly=0)
Nov 24 14:35:15 compute-0 ovn_controller[95613]: 2025-11-24T14:35:15Z|00106|binding|INFO|Setting lport 80657a89-07d8-4355-a80e-f13874579df8 down in Southbound
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.561 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:15 compute-0 ovn_controller[95613]: 2025-11-24T14:35:15Z|00107|binding|INFO|Removing iface tap80657a89-07 ovn-installed in OVS
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.564 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.579 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:5b:4f 10.100.0.5'], port_security=['fa:16:3e:74:5b:4f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '70f125d3-772c-4512-89cd-87864bebf8cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88c27d4f-052b-4040-8dc7-91a7fc24ef8c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': '006ec4d8-4baf-4197-8d42-e48ef06fa486', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=404425a2-d90a-4f58-8342-049369e4c90c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=80657a89-07d8-4355-a80e-f13874579df8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.579 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.583 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 80657a89-07d8-4355-a80e-f13874579df8 in datapath 88c27d4f-052b-4040-8dc7-91a7fc24ef8c unbound from our chassis
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.585 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88c27d4f-052b-4040-8dc7-91a7fc24ef8c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.586 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a316bfc8-2ff5-44e9-b3de-f1b177b8aff5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.587 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c namespace which is not needed anymore
Nov 24 14:35:15 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 24 14:35:15 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 17.727s CPU time.
Nov 24 14:35:15 compute-0 systemd-machined[153483]: Machine qemu-6-instance-00000006 terminated.
Nov 24 14:35:15 compute-0 neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c[215918]: [NOTICE]   (215922) : haproxy version is 2.8.14-c23fe91
Nov 24 14:35:15 compute-0 neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c[215918]: [NOTICE]   (215922) : path to executable is /usr/sbin/haproxy
Nov 24 14:35:15 compute-0 neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c[215918]: [WARNING]  (215922) : Exiting Master process...
Nov 24 14:35:15 compute-0 neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c[215918]: [ALERT]    (215922) : Current worker (215924) exited with code 143 (Terminated)
Nov 24 14:35:15 compute-0 neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c[215918]: [WARNING]  (215922) : All workers exited. Exiting... (0)
Nov 24 14:35:15 compute-0 systemd[1]: libpod-cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d.scope: Deactivated successfully.
Nov 24 14:35:15 compute-0 podman[216737]: 2025-11-24 14:35:15.717343147 +0000 UTC m=+0.041257836 container died cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:35:15 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d-userdata-shm.mount: Deactivated successfully.
Nov 24 14:35:15 compute-0 systemd[1]: var-lib-containers-storage-overlay-fd0de350e27a4bc18a33fd6a9366a9729f17239c500ba1354741ca17159745bc-merged.mount: Deactivated successfully.
Nov 24 14:35:15 compute-0 podman[216737]: 2025-11-24 14:35:15.747695683 +0000 UTC m=+0.071610372 container cleanup cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 24 14:35:15 compute-0 systemd[1]: libpod-conmon-cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d.scope: Deactivated successfully.
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.777 187122 INFO nova.virt.libvirt.driver [-] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Instance destroyed successfully.
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.778 187122 DEBUG nova.objects.instance [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid 70f125d3-772c-4512-89cd-87864bebf8cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.793 187122 DEBUG nova.virt.libvirt.vif [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:33:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-814470289',display_name='tempest-TestNetworkBasicOps-server-814470289',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-814470289',id=6,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNxVUOsd/7KtYLgQBtlHKzBeWF9UhFxiZgEb7YLnyBIIN1OVKJ0gJRpD8NWMGNkw7u8jH0JIXAWvXNBkBhNhVRmW7IlL2b/guGzfz0SVJ7p7J0ywko8iMgOfh8p0fPQCuw==',key_name='tempest-TestNetworkBasicOps-1052892258',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:33:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-00ajpf73',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:33:39Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=70f125d3-772c-4512-89cd-87864bebf8cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.794 187122 DEBUG nova.network.os_vif_util [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.794 187122 DEBUG nova.network.os_vif_util [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:5b:4f,bridge_name='br-int',has_traffic_filtering=True,id=80657a89-07d8-4355-a80e-f13874579df8,network=Network(88c27d4f-052b-4040-8dc7-91a7fc24ef8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80657a89-07') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.794 187122 DEBUG os_vif [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:5b:4f,bridge_name='br-int',has_traffic_filtering=True,id=80657a89-07d8-4355-a80e-f13874579df8,network=Network(88c27d4f-052b-4040-8dc7-91a7fc24ef8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80657a89-07') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.795 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.796 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap80657a89-07, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.797 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.798 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.800 187122 INFO os_vif [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:5b:4f,bridge_name='br-int',has_traffic_filtering=True,id=80657a89-07d8-4355-a80e-f13874579df8,network=Network(88c27d4f-052b-4040-8dc7-91a7fc24ef8c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap80657a89-07')
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.800 187122 INFO nova.virt.libvirt.driver [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Deleting instance files /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc_del
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.801 187122 INFO nova.virt.libvirt.driver [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Deletion of /var/lib/nova/instances/70f125d3-772c-4512-89cd-87864bebf8cc_del complete
Nov 24 14:35:15 compute-0 podman[216772]: 2025-11-24 14:35:15.812738057 +0000 UTC m=+0.041628086 container remove cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.818 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1f52b1-1e12-4bab-a6b8-a9a19dd04039]: (4, ('Mon Nov 24 02:35:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c (cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d)\ncc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d\nMon Nov 24 02:35:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c (cc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d)\ncc1d48a66e2bef9f907bd4c82e19246323017a57694f2b61b609ba32ddc6124d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.820 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[cefdefc1-0510-4b2b-90e8-0de8b52251f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.822 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88c27d4f-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:15 compute-0 kernel: tap88c27d4f-00: left promiscuous mode
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.824 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.836 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.838 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[6b234d82-8a05-47dd-b45d-e44bc2d039b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.854 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[1f25a044-b67c-4306-909e-cb9317683ac4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.855 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[adf14be6-b0ec-4ddf-8354-0459e47fc0bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.866 187122 INFO nova.compute.manager [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Took 0.34 seconds to destroy the instance on the hypervisor.
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.866 187122 DEBUG oslo.service.loopingcall [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.867 187122 DEBUG nova.compute.manager [-] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:35:15 compute-0 nova_compute[187118]: 2025-11-24 14:35:15.867 187122 DEBUG nova.network.neutron [-] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.870 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[b07dcea3-2480-4b2d-9c9d-627c23dc8520]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 308600, 'reachable_time': 18183, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216795, 'error': None, 'target': 'ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:15 compute-0 systemd[1]: run-netns-ovnmeta\x2d88c27d4f\x2d052b\x2d4040\x2d8dc7\x2d91a7fc24ef8c.mount: Deactivated successfully.
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.873 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88c27d4f-052b-4040-8dc7-91a7fc24ef8c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:35:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:15.873 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[a2286e15-7e28-4ec6-9c11-b4e2f96cf3ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.475 187122 DEBUG nova.network.neutron [-] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.493 187122 INFO nova.compute.manager [-] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Took 0.63 seconds to deallocate network for instance.
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.529 187122 DEBUG oslo_concurrency.lockutils [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.529 187122 DEBUG oslo_concurrency.lockutils [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.547 187122 DEBUG nova.compute.manager [req-685733ae-fbbc-4e0a-b1e8-3f7ca54b587e req-6f5b1c07-7158-4a9b-9fb8-093ea7f99d1a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-deleted-80657a89-07d8-4355-a80e-f13874579df8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.597 187122 DEBUG nova.compute.provider_tree [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.614 187122 DEBUG nova.scheduler.client.report [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.634 187122 DEBUG oslo_concurrency.lockutils [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.655 187122 INFO nova.scheduler.client.report [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance 70f125d3-772c-4512-89cd-87864bebf8cc
Nov 24 14:35:16 compute-0 nova_compute[187118]: 2025-11-24 14:35:16.709 187122 DEBUG oslo_concurrency.lockutils [None req-3375251e-98a2-48d4-9bc7-2e23c5a64463 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.306 187122 DEBUG nova.network.neutron [req-75ae60eb-cca9-42fe-abbf-c3494e890e89 req-ac90b140-233b-4f07-a40c-4b66532c9390 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updated VIF entry in instance network info cache for port 80657a89-07d8-4355-a80e-f13874579df8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.306 187122 DEBUG nova.network.neutron [req-75ae60eb-cca9-42fe-abbf-c3494e890e89 req-ac90b140-233b-4f07-a40c-4b66532c9390 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Updating instance_info_cache with network_info: [{"id": "80657a89-07d8-4355-a80e-f13874579df8", "address": "fa:16:3e:74:5b:4f", "network": {"id": "88c27d4f-052b-4040-8dc7-91a7fc24ef8c", "bridge": "br-int", "label": "tempest-network-smoke--789241292", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap80657a89-07", "ovs_interfaceid": "80657a89-07d8-4355-a80e-f13874579df8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.331 187122 DEBUG oslo_concurrency.lockutils [req-75ae60eb-cca9-42fe-abbf-c3494e890e89 req-ac90b140-233b-4f07-a40c-4b66532c9390 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-70f125d3-772c-4512-89cd-87864bebf8cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.522 187122 DEBUG nova.compute.manager [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-unplugged-80657a89-07d8-4355-a80e-f13874579df8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.522 187122 DEBUG oslo_concurrency.lockutils [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.522 187122 DEBUG oslo_concurrency.lockutils [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.523 187122 DEBUG oslo_concurrency.lockutils [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.523 187122 DEBUG nova.compute.manager [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] No waiting events found dispatching network-vif-unplugged-80657a89-07d8-4355-a80e-f13874579df8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.523 187122 WARNING nova.compute.manager [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received unexpected event network-vif-unplugged-80657a89-07d8-4355-a80e-f13874579df8 for instance with vm_state deleted and task_state None.
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.523 187122 DEBUG nova.compute.manager [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received event network-vif-plugged-80657a89-07d8-4355-a80e-f13874579df8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.523 187122 DEBUG oslo_concurrency.lockutils [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.524 187122 DEBUG oslo_concurrency.lockutils [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.524 187122 DEBUG oslo_concurrency.lockutils [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "70f125d3-772c-4512-89cd-87864bebf8cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.524 187122 DEBUG nova.compute.manager [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] No waiting events found dispatching network-vif-plugged-80657a89-07d8-4355-a80e-f13874579df8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:35:17 compute-0 nova_compute[187118]: 2025-11-24 14:35:17.524 187122 WARNING nova.compute.manager [req-c4b8b9cc-8b04-4637-a0fe-fd9b9379ae5f req-c47807a8-1011-4198-a5d1-96d07f26e79e 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Received unexpected event network-vif-plugged-80657a89-07d8-4355-a80e-f13874579df8 for instance with vm_state deleted and task_state None.
Nov 24 14:35:19 compute-0 podman[216797]: 2025-11-24 14:35:19.518728354 +0000 UTC m=+0.104950852 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 14:35:19 compute-0 podman[216796]: 2025-11-24 14:35:19.522956579 +0000 UTC m=+0.103836052 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 14:35:19 compute-0 nova_compute[187118]: 2025-11-24 14:35:19.868 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:20 compute-0 nova_compute[187118]: 2025-11-24 14:35:20.356 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:20 compute-0 nova_compute[187118]: 2025-11-24 14:35:20.461 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:20 compute-0 nova_compute[187118]: 2025-11-24 14:35:20.764 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:20 compute-0 nova_compute[187118]: 2025-11-24 14:35:20.797 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:21 compute-0 nova_compute[187118]: 2025-11-24 14:35:21.610 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763994906.609187, 9b3efeab-7379-4e78-8df8-032e6e66cd67 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:35:21 compute-0 nova_compute[187118]: 2025-11-24 14:35:21.611 187122 INFO nova.compute.manager [-] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] VM Stopped (Lifecycle Event)
Nov 24 14:35:21 compute-0 nova_compute[187118]: 2025-11-24 14:35:21.640 187122 DEBUG nova.compute.manager [None req-eb4cf1d2-8385-434b-95cc-1470b6d2bff4 - - - - - -] [instance: 9b3efeab-7379-4e78-8df8-032e6e66cd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:35:24 compute-0 nova_compute[187118]: 2025-11-24 14:35:24.870 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:25 compute-0 podman[216837]: 2025-11-24 14:35:25.488428114 +0000 UTC m=+0.078136633 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.6, architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, distribution-scope=public, config_id=edpm)
Nov 24 14:35:25 compute-0 podman[216836]: 2025-11-24 14:35:25.504837867 +0000 UTC m=+0.097224470 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Nov 24 14:35:25 compute-0 nova_compute[187118]: 2025-11-24 14:35:25.800 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:29 compute-0 nova_compute[187118]: 2025-11-24 14:35:29.873 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:30 compute-0 nova_compute[187118]: 2025-11-24 14:35:30.776 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763994915.7751398, 70f125d3-772c-4512-89cd-87864bebf8cc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:35:30 compute-0 nova_compute[187118]: 2025-11-24 14:35:30.777 187122 INFO nova.compute.manager [-] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] VM Stopped (Lifecycle Event)
Nov 24 14:35:30 compute-0 nova_compute[187118]: 2025-11-24 14:35:30.790 187122 DEBUG nova.compute.manager [None req-8406c02e-c1e4-4d37-acb1-08287d0422e1 - - - - - -] [instance: 70f125d3-772c-4512-89cd-87864bebf8cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:35:30 compute-0 nova_compute[187118]: 2025-11-24 14:35:30.802 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:31 compute-0 podman[216882]: 2025-11-24 14:35:31.438314356 +0000 UTC m=+0.045465570 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:35:34 compute-0 nova_compute[187118]: 2025-11-24 14:35:34.876 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.129 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:35:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:35:35 compute-0 nova_compute[187118]: 2025-11-24 14:35:35.804 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:35 compute-0 nova_compute[187118]: 2025-11-24 14:35:35.815 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:37 compute-0 nova_compute[187118]: 2025-11-24 14:35:37.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:38 compute-0 nova_compute[187118]: 2025-11-24 14:35:38.841 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "f83c0875-7d40-4037-8b77-2fea9c1fd962" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:38 compute-0 nova_compute[187118]: 2025-11-24 14:35:38.841 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:38 compute-0 nova_compute[187118]: 2025-11-24 14:35:38.868 187122 DEBUG nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:35:38 compute-0 nova_compute[187118]: 2025-11-24 14:35:38.962 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:38 compute-0 nova_compute[187118]: 2025-11-24 14:35:38.963 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:38 compute-0 nova_compute[187118]: 2025-11-24 14:35:38.976 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:35:38 compute-0 nova_compute[187118]: 2025-11-24 14:35:38.977 187122 INFO nova.compute.claims [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.079 187122 DEBUG nova.compute.provider_tree [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.093 187122 DEBUG nova.scheduler.client.report [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.115 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.116 187122 DEBUG nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.169 187122 DEBUG nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.170 187122 DEBUG nova.network.neutron [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.189 187122 INFO nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.207 187122 DEBUG nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.321 187122 DEBUG nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.324 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.324 187122 INFO nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Creating image(s)
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.325 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.326 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.327 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.356 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.409 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.410 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.411 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.421 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.470 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.472 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.511 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.512 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.513 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.568 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.570 187122 DEBUG nova.virt.disk.api [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.571 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.625 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.626 187122 DEBUG nova.virt.disk.api [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.626 187122 DEBUG nova.objects.instance [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid f83c0875-7d40-4037-8b77-2fea9c1fd962 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.643 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.643 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Ensure instance console log exists: /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.644 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.644 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.645 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.823 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.824 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.824 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.825 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:35:39 compute-0 nova_compute[187118]: 2025-11-24 14:35:39.879 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.035 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.037 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5773MB free_disk=73.45866012573242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.038 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.038 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.114 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance f83c0875-7d40-4037-8b77-2fea9c1fd962 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.114 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.115 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.182 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.197 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.223 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.224 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:40 compute-0 podman[216922]: 2025-11-24 14:35:40.457770672 +0000 UTC m=+0.063397925 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:35:40 compute-0 nova_compute[187118]: 2025-11-24 14:35:40.806 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:41 compute-0 nova_compute[187118]: 2025-11-24 14:35:41.224 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:41 compute-0 nova_compute[187118]: 2025-11-24 14:35:41.298 187122 DEBUG nova.policy [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:35:41 compute-0 nova_compute[187118]: 2025-11-24 14:35:41.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:41 compute-0 nova_compute[187118]: 2025-11-24 14:35:41.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:35:41 compute-0 nova_compute[187118]: 2025-11-24 14:35:41.821 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:35:42 compute-0 nova_compute[187118]: 2025-11-24 14:35:42.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:42 compute-0 nova_compute[187118]: 2025-11-24 14:35:42.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:42 compute-0 nova_compute[187118]: 2025-11-24 14:35:42.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.097 187122 DEBUG nova.network.neutron [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Successfully updated port: b93ae7b1-0bfb-43d8-8b9e-c0584b9161af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.119 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-f83c0875-7d40-4037-8b77-2fea9c1fd962" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.119 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-f83c0875-7d40-4037-8b77-2fea9c1fd962" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.119 187122 DEBUG nova.network.neutron [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.211 187122 DEBUG nova.compute.manager [req-052aa3bd-57c9-435c-a799-30168177ccf3 req-f3d08b20-0190-4b68-8c5d-afab341594a1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Received event network-changed-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.211 187122 DEBUG nova.compute.manager [req-052aa3bd-57c9-435c-a799-30168177ccf3 req-f3d08b20-0190-4b68-8c5d-afab341594a1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Refreshing instance network info cache due to event network-changed-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.211 187122 DEBUG oslo_concurrency.lockutils [req-052aa3bd-57c9-435c-a799-30168177ccf3 req-f3d08b20-0190-4b68-8c5d-afab341594a1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-f83c0875-7d40-4037-8b77-2fea9c1fd962" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.296 187122 DEBUG nova.network.neutron [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:35:44 compute-0 nova_compute[187118]: 2025-11-24 14:35:44.879 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:45 compute-0 nova_compute[187118]: 2025-11-24 14:35:45.810 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.193 187122 DEBUG nova.network.neutron [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Updating instance_info_cache with network_info: [{"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.216 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-f83c0875-7d40-4037-8b77-2fea9c1fd962" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.217 187122 DEBUG nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Instance network_info: |[{"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.217 187122 DEBUG oslo_concurrency.lockutils [req-052aa3bd-57c9-435c-a799-30168177ccf3 req-f3d08b20-0190-4b68-8c5d-afab341594a1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-f83c0875-7d40-4037-8b77-2fea9c1fd962" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.218 187122 DEBUG nova.network.neutron [req-052aa3bd-57c9-435c-a799-30168177ccf3 req-f3d08b20-0190-4b68-8c5d-afab341594a1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Refreshing network info cache for port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.223 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Start _get_guest_xml network_info=[{"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.228 187122 WARNING nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.234 187122 DEBUG nova.virt.libvirt.host [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.235 187122 DEBUG nova.virt.libvirt.host [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.241 187122 DEBUG nova.virt.libvirt.host [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.241 187122 DEBUG nova.virt.libvirt.host [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.242 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.242 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.242 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.242 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.242 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.242 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.243 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.243 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.243 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.243 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.243 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.243 187122 DEBUG nova.virt.hardware [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.246 187122 DEBUG nova.virt.libvirt.vif [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:35:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1238170478',display_name='tempest-TestNetworkBasicOps-server-1238170478',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1238170478',id=8,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMaDtCizPr2OTMBcR4KngGC/4drXt8pG7rxn0jVTm+JX2cnEC5ptsPiwBoDklIvKaRDV7zl/PtsiaFB2x6gc2zVkIyCx4hPyyxDt9rd4z1VxZQTL/NJ8Op7JjeVTcTcD5Q==',key_name='tempest-TestNetworkBasicOps-1749377530',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-k5yld21n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:35:39Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=f83c0875-7d40-4037-8b77-2fea9c1fd962,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.246 187122 DEBUG nova.network.os_vif_util [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.247 187122 DEBUG nova.network.os_vif_util [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.247 187122 DEBUG nova.objects.instance [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid f83c0875-7d40-4037-8b77-2fea9c1fd962 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.263 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <uuid>f83c0875-7d40-4037-8b77-2fea9c1fd962</uuid>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <name>instance-00000008</name>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-1238170478</nova:name>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:35:46</nova:creationTime>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:35:46 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:35:46 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:35:46 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:35:46 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:35:46 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:35:46 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:35:46 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:35:46 compute-0 nova_compute[187118]:         <nova:port uuid="b93ae7b1-0bfb-43d8-8b9e-c0584b9161af">
Nov 24 14:35:46 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <system>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <entry name="serial">f83c0875-7d40-4037-8b77-2fea9c1fd962</entry>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <entry name="uuid">f83c0875-7d40-4037-8b77-2fea9c1fd962</entry>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     </system>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <os>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   </os>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <features>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   </features>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk.config"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:36:56:b3"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <target dev="tapb93ae7b1-0b"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/console.log" append="off"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <video>
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     </video>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:35:46 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:35:46 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:35:46 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:35:46 compute-0 nova_compute[187118]: </domain>
Nov 24 14:35:46 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.265 187122 DEBUG nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Preparing to wait for external event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.265 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.266 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.266 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.267 187122 DEBUG nova.virt.libvirt.vif [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:35:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1238170478',display_name='tempest-TestNetworkBasicOps-server-1238170478',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1238170478',id=8,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMaDtCizPr2OTMBcR4KngGC/4drXt8pG7rxn0jVTm+JX2cnEC5ptsPiwBoDklIvKaRDV7zl/PtsiaFB2x6gc2zVkIyCx4hPyyxDt9rd4z1VxZQTL/NJ8Op7JjeVTcTcD5Q==',key_name='tempest-TestNetworkBasicOps-1749377530',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-k5yld21n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:35:39Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=f83c0875-7d40-4037-8b77-2fea9c1fd962,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.268 187122 DEBUG nova.network.os_vif_util [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.268 187122 DEBUG nova.network.os_vif_util [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.269 187122 DEBUG os_vif [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.270 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.270 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.271 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.276 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.277 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb93ae7b1-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.277 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb93ae7b1-0b, col_values=(('external_ids', {'iface-id': 'b93ae7b1-0bfb-43d8-8b9e-c0584b9161af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:56:b3', 'vm-uuid': 'f83c0875-7d40-4037-8b77-2fea9c1fd962'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.279 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:46 compute-0 NetworkManager[55697]: <info>  [1763994946.2813] manager: (tapb93ae7b1-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.282 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.289 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.290 187122 INFO os_vif [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b')
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.356 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.356 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.357 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:36:56:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:35:46 compute-0 nova_compute[187118]: 2025-11-24 14:35:46.357 187122 INFO nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Using config drive
Nov 24 14:35:46 compute-0 podman[216950]: 2025-11-24 14:35:46.467323785 +0000 UTC m=+0.072939089 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118)
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.350 187122 INFO nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Creating config drive at /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk.config
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.354 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwha0wu1v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.490 187122 DEBUG oslo_concurrency.processutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwha0wu1v" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:35:47 compute-0 kernel: tapb93ae7b1-0b: entered promiscuous mode
Nov 24 14:35:47 compute-0 ovn_controller[95613]: 2025-11-24T14:35:47Z|00108|binding|INFO|Claiming lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af for this chassis.
Nov 24 14:35:47 compute-0 ovn_controller[95613]: 2025-11-24T14:35:47Z|00109|binding|INFO|b93ae7b1-0bfb-43d8-8b9e-c0584b9161af: Claiming fa:16:3e:36:56:b3 10.100.0.8
Nov 24 14:35:47 compute-0 NetworkManager[55697]: <info>  [1763994947.5596] manager: (tapb93ae7b1-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.560 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.568 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.576 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:56:b3 10.100.0.8'], port_security=['fa:16:3e:36:56:b3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1455315076', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f83c0875-7d40-4037-8b77-2fea9c1fd962', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1455315076', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86594553-2610-4677-ad9a-258b4f3e5a3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d2f7949-6df8-49f4-9577-8ba82e7f7173, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.577 104469 INFO neutron.agent.ovn.metadata.agent [-] Port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af in datapath a0bdebee-864c-45a0-b54c-1e06d962d72e bound to our chassis
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.578 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a0bdebee-864c-45a0-b54c-1e06d962d72e
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.589 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f128de40-c387-42be-866e-57470c793bef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.589 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa0bdebee-81 in ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.591 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa0bdebee-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.591 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a295c0e4-a0e1-42bd-b4d2-e5fa36b6f712]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.592 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7d794f51-5e3a-4a4a-83a6-955c45191b0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 systemd-udevd[216989]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.607 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[e5349b11-ec45-4b37-9ee5-6b96b8156bb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 systemd-machined[153483]: New machine qemu-8-instance-00000008.
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.619 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:47 compute-0 NetworkManager[55697]: <info>  [1763994947.6210] device (tapb93ae7b1-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:35:47 compute-0 NetworkManager[55697]: <info>  [1763994947.6223] device (tapb93ae7b1-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:35:47 compute-0 ovn_controller[95613]: 2025-11-24T14:35:47Z|00110|binding|INFO|Setting lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af ovn-installed in OVS
Nov 24 14:35:47 compute-0 ovn_controller[95613]: 2025-11-24T14:35:47Z|00111|binding|INFO|Setting lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af up in Southbound
Nov 24 14:35:47 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.625 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.629 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9f4a66-8162-43c4-8712-51a212c962e0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.665 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[ac5cadc9-ede0-4181-b3ee-eb3259c3c3c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.671 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[2565fa79-5ec8-456c-afc2-ab22e329e4fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 NetworkManager[55697]: <info>  [1763994947.6723] manager: (tapa0bdebee-80): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.715 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[16e3b946-197a-4a30-a01a-927fd42985e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.719 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[413592b4-0139-42fe-a666-da9f04465dfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 NetworkManager[55697]: <info>  [1763994947.7521] device (tapa0bdebee-80): carrier: link connected
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.757 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d712d7-8fed-46e3-841a-d99c4533d60a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.774 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e70d255d-5d19-4616-8667-12271b2bcf00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0bdebee-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:65:00'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 321518, 'reachable_time': 35717, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217020, 'error': None, 'target': 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.789 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f2df1a85-7d38-4300-a2d9-6ba97dc5d2c0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee0:6500'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 321518, 'tstamp': 321518}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217021, 'error': None, 'target': 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.803 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3e411d64-c366-4cbb-a5e9-b707355f1b1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0bdebee-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:65:00'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 321518, 'reachable_time': 35717, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217022, 'error': None, 'target': 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.838 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[adda5790-87f2-4555-8286-c0b35a203698]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.916 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[0ed14be4-66c7-40e6-a002-337ecb12c4f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.918 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0bdebee-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.919 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.920 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0bdebee-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:47 compute-0 NetworkManager[55697]: <info>  [1763994947.9223] manager: (tapa0bdebee-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.922 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:47 compute-0 kernel: tapa0bdebee-80: entered promiscuous mode
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.924 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.925 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa0bdebee-80, col_values=(('external_ids', {'iface-id': 'f28333b2-9b09-47b9-87c5-4456a99747de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:47 compute-0 ovn_controller[95613]: 2025-11-24T14:35:47Z|00112|binding|INFO|Releasing lport f28333b2-9b09-47b9-87c5-4456a99747de from this chassis (sb_readonly=0)
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.929 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0bdebee-864c-45a0-b54c-1e06d962d72e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0bdebee-864c-45a0-b54c-1e06d962d72e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.929 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3d3a5ec5-ca9a-4d32-a66b-5f5a02828b06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.930 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-a0bdebee-864c-45a0-b54c-1e06d962d72e
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/a0bdebee-864c-45a0-b54c-1e06d962d72e.pid.haproxy
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID a0bdebee-864c-45a0-b54c-1e06d962d72e
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:35:47 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:47.932 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'env', 'PROCESS_TAG=haproxy-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a0bdebee-864c-45a0-b54c-1e06d962d72e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:35:47 compute-0 nova_compute[187118]: 2025-11-24 14:35:47.939 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.169 187122 DEBUG nova.compute.manager [req-4d94e77e-a997-41d1-8664-5f6a1a7c68d9 req-4b5ad76b-2d40-4140-aea1-22c39f4b82a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Received event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.170 187122 DEBUG oslo_concurrency.lockutils [req-4d94e77e-a997-41d1-8664-5f6a1a7c68d9 req-4b5ad76b-2d40-4140-aea1-22c39f4b82a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.171 187122 DEBUG oslo_concurrency.lockutils [req-4d94e77e-a997-41d1-8664-5f6a1a7c68d9 req-4b5ad76b-2d40-4140-aea1-22c39f4b82a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.172 187122 DEBUG oslo_concurrency.lockutils [req-4d94e77e-a997-41d1-8664-5f6a1a7c68d9 req-4b5ad76b-2d40-4140-aea1-22c39f4b82a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.172 187122 DEBUG nova.compute.manager [req-4d94e77e-a997-41d1-8664-5f6a1a7c68d9 req-4b5ad76b-2d40-4140-aea1-22c39f4b82a7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Processing event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:35:48 compute-0 podman[217054]: 2025-11-24 14:35:48.292977894 +0000 UTC m=+0.055309121 container create 88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Nov 24 14:35:48 compute-0 systemd[1]: Started libpod-conmon-88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962.scope.
Nov 24 14:35:48 compute-0 podman[217054]: 2025-11-24 14:35:48.259025425 +0000 UTC m=+0.021356682 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:35:48 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:35:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52c5ce220ccc09b8669aef236c35c6a4f1daeda5b8ebda88bb1270cc7c3826e7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:35:48 compute-0 podman[217054]: 2025-11-24 14:35:48.393002562 +0000 UTC m=+0.155333789 container init 88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:35:48 compute-0 podman[217054]: 2025-11-24 14:35:48.398553956 +0000 UTC m=+0.160885163 container start 88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.404 187122 DEBUG nova.network.neutron [req-052aa3bd-57c9-435c-a799-30168177ccf3 req-f3d08b20-0190-4b68-8c5d-afab341594a1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Updated VIF entry in instance network info cache for port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.405 187122 DEBUG nova.network.neutron [req-052aa3bd-57c9-435c-a799-30168177ccf3 req-f3d08b20-0190-4b68-8c5d-afab341594a1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Updating instance_info_cache with network_info: [{"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:35:48 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217070]: [NOTICE]   (217079) : New worker (217082) forked
Nov 24 14:35:48 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217070]: [NOTICE]   (217079) : Loading success.
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.419 187122 DEBUG oslo_concurrency.lockutils [req-052aa3bd-57c9-435c-a799-30168177ccf3 req-f3d08b20-0190-4b68-8c5d-afab341594a1 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-f83c0875-7d40-4037-8b77-2fea9c1fd962" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.478 187122 DEBUG nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.479 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994948.4782376, f83c0875-7d40-4037-8b77-2fea9c1fd962 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.480 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] VM Started (Lifecycle Event)
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.482 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.486 187122 INFO nova.virt.libvirt.driver [-] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Instance spawned successfully.
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.487 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.513 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.519 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.524 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.525 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.525 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.526 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.526 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.527 187122 DEBUG nova.virt.libvirt.driver [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.553 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.554 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994948.4784842, f83c0875-7d40-4037-8b77-2fea9c1fd962 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.554 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] VM Paused (Lifecycle Event)
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.577 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.581 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994948.481869, f83c0875-7d40-4037-8b77-2fea9c1fd962 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.581 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] VM Resumed (Lifecycle Event)
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.588 187122 INFO nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Took 9.27 seconds to spawn the instance on the hypervisor.
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.588 187122 DEBUG nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.598 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.602 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.630 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.649 187122 INFO nova.compute.manager [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Took 9.73 seconds to build instance.
Nov 24 14:35:48 compute-0 nova_compute[187118]: 2025-11-24 14:35:48.664 187122 DEBUG oslo_concurrency.lockutils [None req-9de32ebc-7c7c-4592-9c9c-cbeb8b7046ed ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.823s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:49 compute-0 nova_compute[187118]: 2025-11-24 14:35:49.881 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:50 compute-0 nova_compute[187118]: 2025-11-24 14:35:50.284 187122 DEBUG nova.compute.manager [req-5bdc5272-dd2d-40ee-979b-3c791087c3ba req-dfa3ec74-17d1-4004-9a22-a489f124e6df 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Received event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:50 compute-0 nova_compute[187118]: 2025-11-24 14:35:50.285 187122 DEBUG oslo_concurrency.lockutils [req-5bdc5272-dd2d-40ee-979b-3c791087c3ba req-dfa3ec74-17d1-4004-9a22-a489f124e6df 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:50 compute-0 nova_compute[187118]: 2025-11-24 14:35:50.286 187122 DEBUG oslo_concurrency.lockutils [req-5bdc5272-dd2d-40ee-979b-3c791087c3ba req-dfa3ec74-17d1-4004-9a22-a489f124e6df 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:50 compute-0 nova_compute[187118]: 2025-11-24 14:35:50.286 187122 DEBUG oslo_concurrency.lockutils [req-5bdc5272-dd2d-40ee-979b-3c791087c3ba req-dfa3ec74-17d1-4004-9a22-a489f124e6df 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:50 compute-0 nova_compute[187118]: 2025-11-24 14:35:50.287 187122 DEBUG nova.compute.manager [req-5bdc5272-dd2d-40ee-979b-3c791087c3ba req-dfa3ec74-17d1-4004-9a22-a489f124e6df 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] No waiting events found dispatching network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:35:50 compute-0 nova_compute[187118]: 2025-11-24 14:35:50.288 187122 WARNING nova.compute.manager [req-5bdc5272-dd2d-40ee-979b-3c791087c3ba req-dfa3ec74-17d1-4004-9a22-a489f124e6df 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Received unexpected event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af for instance with vm_state active and task_state None.
Nov 24 14:35:50 compute-0 podman[217092]: 2025-11-24 14:35:50.46803068 +0000 UTC m=+0.076041535 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_id=multipathd)
Nov 24 14:35:50 compute-0 podman[217093]: 2025-11-24 14:35:50.485068872 +0000 UTC m=+0.083175062 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 14:35:51 compute-0 nova_compute[187118]: 2025-11-24 14:35:51.281 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:54 compute-0 NetworkManager[55697]: <info>  [1763994954.5904] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 24 14:35:54 compute-0 NetworkManager[55697]: <info>  [1763994954.5914] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 24 14:35:54 compute-0 nova_compute[187118]: 2025-11-24 14:35:54.590 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:54 compute-0 ovn_controller[95613]: 2025-11-24T14:35:54Z|00113|binding|INFO|Releasing lport f28333b2-9b09-47b9-87c5-4456a99747de from this chassis (sb_readonly=0)
Nov 24 14:35:54 compute-0 nova_compute[187118]: 2025-11-24 14:35:54.614 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:54 compute-0 ovn_controller[95613]: 2025-11-24T14:35:54Z|00114|binding|INFO|Releasing lport f28333b2-9b09-47b9-87c5-4456a99747de from this chassis (sb_readonly=0)
Nov 24 14:35:54 compute-0 nova_compute[187118]: 2025-11-24 14:35:54.617 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:54 compute-0 nova_compute[187118]: 2025-11-24 14:35:54.883 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.439 187122 DEBUG nova.compute.manager [req-227d1fa9-d189-4082-84e0-29e3333110b7 req-9a79e407-e1a4-4582-a043-d642905ab61a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Received event network-changed-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.440 187122 DEBUG nova.compute.manager [req-227d1fa9-d189-4082-84e0-29e3333110b7 req-9a79e407-e1a4-4582-a043-d642905ab61a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Refreshing instance network info cache due to event network-changed-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.440 187122 DEBUG oslo_concurrency.lockutils [req-227d1fa9-d189-4082-84e0-29e3333110b7 req-9a79e407-e1a4-4582-a043-d642905ab61a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-f83c0875-7d40-4037-8b77-2fea9c1fd962" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.441 187122 DEBUG oslo_concurrency.lockutils [req-227d1fa9-d189-4082-84e0-29e3333110b7 req-9a79e407-e1a4-4582-a043-d642905ab61a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-f83c0875-7d40-4037-8b77-2fea9c1fd962" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.442 187122 DEBUG nova.network.neutron [req-227d1fa9-d189-4082-84e0-29e3333110b7 req-9a79e407-e1a4-4582-a043-d642905ab61a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Refreshing network info cache for port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.579 187122 DEBUG oslo_concurrency.lockutils [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "f83c0875-7d40-4037-8b77-2fea9c1fd962" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.580 187122 DEBUG oslo_concurrency.lockutils [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.581 187122 DEBUG oslo_concurrency.lockutils [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.581 187122 DEBUG oslo_concurrency.lockutils [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.582 187122 DEBUG oslo_concurrency.lockutils [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.583 187122 INFO nova.compute.manager [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Terminating instance
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.584 187122 DEBUG nova.compute.manager [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:35:55 compute-0 kernel: tapb93ae7b1-0b (unregistering): left promiscuous mode
Nov 24 14:35:55 compute-0 NetworkManager[55697]: <info>  [1763994955.6100] device (tapb93ae7b1-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.621 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:55 compute-0 ovn_controller[95613]: 2025-11-24T14:35:55Z|00115|binding|INFO|Releasing lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af from this chassis (sb_readonly=0)
Nov 24 14:35:55 compute-0 ovn_controller[95613]: 2025-11-24T14:35:55Z|00116|binding|INFO|Setting lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af down in Southbound
Nov 24 14:35:55 compute-0 ovn_controller[95613]: 2025-11-24T14:35:55Z|00117|binding|INFO|Removing iface tapb93ae7b1-0b ovn-installed in OVS
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.633 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:56:b3 10.100.0.8'], port_security=['fa:16:3e:36:56:b3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1455315076', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'f83c0875-7d40-4037-8b77-2fea9c1fd962', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1455315076', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': '86594553-2610-4677-ad9a-258b4f3e5a3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d2f7949-6df8-49f4-9577-8ba82e7f7173, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.634 104469 INFO neutron.agent.ovn.metadata.agent [-] Port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af in datapath a0bdebee-864c-45a0-b54c-1e06d962d72e unbound from our chassis
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.635 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a0bdebee-864c-45a0-b54c-1e06d962d72e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.636 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a78161-96ff-4c96-a4e5-81683d955a57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.636 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e namespace which is not needed anymore
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.649 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:55 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Nov 24 14:35:55 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 8.157s CPU time.
Nov 24 14:35:55 compute-0 systemd-machined[153483]: Machine qemu-8-instance-00000008 terminated.
Nov 24 14:35:55 compute-0 podman[217135]: 2025-11-24 14:35:55.759679751 +0000 UTC m=+0.096321665 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, 
description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, container_name=openstack_network_exporter)
Nov 24 14:35:55 compute-0 podman[217132]: 2025-11-24 14:35:55.79540228 +0000 UTC m=+0.136661052 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Nov 24 14:35:55 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217070]: [NOTICE]   (217079) : haproxy version is 2.8.14-c23fe91
Nov 24 14:35:55 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217070]: [NOTICE]   (217079) : path to executable is /usr/sbin/haproxy
Nov 24 14:35:55 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217070]: [WARNING]  (217079) : Exiting Master process...
Nov 24 14:35:55 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217070]: [ALERT]    (217079) : Current worker (217082) exited with code 143 (Terminated)
Nov 24 14:35:55 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217070]: [WARNING]  (217079) : All workers exited. Exiting... (0)
Nov 24 14:35:55 compute-0 systemd[1]: libpod-88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962.scope: Deactivated successfully.
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.803 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:55 compute-0 podman[217194]: 2025-11-24 14:35:55.808148753 +0000 UTC m=+0.052007190 container died 88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.808 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:55 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962-userdata-shm.mount: Deactivated successfully.
Nov 24 14:35:55 compute-0 systemd[1]: var-lib-containers-storage-overlay-52c5ce220ccc09b8669aef236c35c6a4f1daeda5b8ebda88bb1270cc7c3826e7-merged.mount: Deactivated successfully.
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.845 187122 INFO nova.virt.libvirt.driver [-] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Instance destroyed successfully.
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.845 187122 DEBUG nova.objects.instance [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid f83c0875-7d40-4037-8b77-2fea9c1fd962 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:35:55 compute-0 podman[217194]: 2025-11-24 14:35:55.849048074 +0000 UTC m=+0.092906511 container cleanup 88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 14:35:55 compute-0 systemd[1]: libpod-conmon-88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962.scope: Deactivated successfully.
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.860 187122 DEBUG nova.virt.libvirt.vif [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:35:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1238170478',display_name='tempest-TestNetworkBasicOps-server-1238170478',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1238170478',id=8,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMaDtCizPr2OTMBcR4KngGC/4drXt8pG7rxn0jVTm+JX2cnEC5ptsPiwBoDklIvKaRDV7zl/PtsiaFB2x6gc2zVkIyCx4hPyyxDt9rd4z1VxZQTL/NJ8Op7JjeVTcTcD5Q==',key_name='tempest-TestNetworkBasicOps-1749377530',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:35:48Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-k5yld21n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:35:48Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=f83c0875-7d40-4037-8b77-2fea9c1fd962,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.861 187122 DEBUG nova.network.os_vif_util [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.861 187122 DEBUG nova.network.os_vif_util [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.861 187122 DEBUG os_vif [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.863 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.863 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb93ae7b1-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.867 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.869 187122 INFO os_vif [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b')
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.869 187122 INFO nova.virt.libvirt.driver [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Deleting instance files /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962_del
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.870 187122 INFO nova.virt.libvirt.driver [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Deletion of /var/lib/nova/instances/f83c0875-7d40-4037-8b77-2fea9c1fd962_del complete
Nov 24 14:35:55 compute-0 podman[217243]: 2025-11-24 14:35:55.909140877 +0000 UTC m=+0.041120099 container remove 88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.913 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7a48a0c1-decd-42dd-9635-3b09e79b90fd]: (4, ('Mon Nov 24 02:35:55 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e (88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962)\n88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962\nMon Nov 24 02:35:55 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e (88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962)\n88f909977c69d40512c364c76df4e78e3b8208e93fee0fdf76dbc442ed1c5962\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.915 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[10d29dd1-5031-40a7-a6ce-5a6418f4f946]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.916 187122 INFO nova.compute.manager [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Took 0.33 seconds to destroy the instance on the hypervisor.
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.916 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0bdebee-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.917 187122 DEBUG oslo.service.loopingcall [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.917 187122 DEBUG nova.compute.manager [-] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.917 187122 DEBUG nova.network.neutron [-] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:35:55 compute-0 kernel: tapa0bdebee-80: left promiscuous mode
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.920 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:55 compute-0 nova_compute[187118]: 2025-11-24 14:35:55.936 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.939 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a1517537-2f02-4727-95d8-6deb05be7959]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.962 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[6dfc4bf1-486d-4f40-9abd-55820a505642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.963 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[1f01f3f2-9f69-490d-9699-f5378d3e7ea6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.980 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[13862ef9-a49f-4efe-9963-f6287188bfb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 321509, 'reachable_time': 44092, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217258, 'error': None, 'target': 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.982 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:35:55 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:55.982 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[0a94140a-0006-46a5-8b62-0cf8246c1b89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:35:55 compute-0 systemd[1]: run-netns-ovnmeta\x2da0bdebee\x2d864c\x2d45a0\x2db54c\x2d1e06d962d72e.mount: Deactivated successfully.
Nov 24 14:35:56 compute-0 nova_compute[187118]: 2025-11-24 14:35:56.605 187122 DEBUG nova.network.neutron [req-227d1fa9-d189-4082-84e0-29e3333110b7 req-9a79e407-e1a4-4582-a043-d642905ab61a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Updated VIF entry in instance network info cache for port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:35:56 compute-0 nova_compute[187118]: 2025-11-24 14:35:56.606 187122 DEBUG nova.network.neutron [req-227d1fa9-d189-4082-84e0-29e3333110b7 req-9a79e407-e1a4-4582-a043-d642905ab61a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Updating instance_info_cache with network_info: [{"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:35:56 compute-0 nova_compute[187118]: 2025-11-24 14:35:56.625 187122 DEBUG oslo_concurrency.lockutils [req-227d1fa9-d189-4082-84e0-29e3333110b7 req-9a79e407-e1a4-4582-a043-d642905ab61a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-f83c0875-7d40-4037-8b77-2fea9c1fd962" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:35:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:56.662 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:56.663 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:35:56.663 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.402 187122 DEBUG nova.network.neutron [-] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.411 187122 INFO nova.compute.manager [-] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Took 1.49 seconds to deallocate network for instance.
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.470 187122 DEBUG oslo_concurrency.lockutils [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.471 187122 DEBUG oslo_concurrency.lockutils [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.524 187122 DEBUG nova.compute.manager [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Received event network-vif-unplugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.524 187122 DEBUG oslo_concurrency.lockutils [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.525 187122 DEBUG oslo_concurrency.lockutils [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.525 187122 DEBUG oslo_concurrency.lockutils [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.525 187122 DEBUG nova.compute.manager [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] No waiting events found dispatching network-vif-unplugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.525 187122 WARNING nova.compute.manager [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Received unexpected event network-vif-unplugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af for instance with vm_state deleted and task_state None.
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.525 187122 DEBUG nova.compute.manager [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Received event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.526 187122 DEBUG oslo_concurrency.lockutils [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.526 187122 DEBUG oslo_concurrency.lockutils [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.526 187122 DEBUG oslo_concurrency.lockutils [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.526 187122 DEBUG nova.compute.manager [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] No waiting events found dispatching network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.526 187122 WARNING nova.compute.manager [req-fe58baa9-c20e-45c6-b87f-ddb3f2a31403 req-314ffc65-f3f3-47ce-be56-9cd2106fc207 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Received unexpected event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af for instance with vm_state deleted and task_state None.
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.552 187122 DEBUG nova.compute.provider_tree [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.561 187122 DEBUG nova.scheduler.client.report [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.577 187122 DEBUG oslo_concurrency.lockutils [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.603 187122 INFO nova.scheduler.client.report [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance f83c0875-7d40-4037-8b77-2fea9c1fd962
Nov 24 14:35:57 compute-0 nova_compute[187118]: 2025-11-24 14:35:57.657 187122 DEBUG oslo_concurrency.lockutils [None req-00cde76a-7f23-4f3b-87a5-babe3041c64f ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "f83c0875-7d40-4037-8b77-2fea9c1fd962" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:35:59 compute-0 nova_compute[187118]: 2025-11-24 14:35:59.885 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:00 compute-0 nova_compute[187118]: 2025-11-24 14:36:00.866 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:02 compute-0 podman[217259]: 2025-11-24 14:36:02.489314426 +0000 UTC m=+0.084456186 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:36:04 compute-0 nova_compute[187118]: 2025-11-24 14:36:04.879 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:04 compute-0 nova_compute[187118]: 2025-11-24 14:36:04.879 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:04 compute-0 nova_compute[187118]: 2025-11-24 14:36:04.887 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:04 compute-0 nova_compute[187118]: 2025-11-24 14:36:04.894 187122 DEBUG nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:36:04 compute-0 nova_compute[187118]: 2025-11-24 14:36:04.976 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:04 compute-0 nova_compute[187118]: 2025-11-24 14:36:04.976 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:04 compute-0 nova_compute[187118]: 2025-11-24 14:36:04.984 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:36:04 compute-0 nova_compute[187118]: 2025-11-24 14:36:04.984 187122 INFO nova.compute.claims [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.102 187122 DEBUG nova.compute.provider_tree [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.115 187122 DEBUG nova.scheduler.client.report [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.143 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.144 187122 DEBUG nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.192 187122 DEBUG nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.192 187122 DEBUG nova.network.neutron [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.211 187122 INFO nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.229 187122 DEBUG nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.331 187122 DEBUG nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.332 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.333 187122 INFO nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Creating image(s)
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.333 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.333 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.334 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.345 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.403 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.404 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.405 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.415 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.472 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.472 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.505 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.506 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.506 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.576 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.577 187122 DEBUG nova.virt.disk.api [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.577 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.652 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.653 187122 DEBUG nova.virt.disk.api [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.654 187122 DEBUG nova.objects.instance [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid 3ddbbdc8-3490-42e4-a549-83bfc6add71f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.665 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.666 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Ensure instance console log exists: /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.667 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.667 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.667 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:05 compute-0 nova_compute[187118]: 2025-11-24 14:36:05.870 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:06 compute-0 nova_compute[187118]: 2025-11-24 14:36:06.204 187122 DEBUG nova.policy [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:36:07 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:07.803 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:36:07 compute-0 nova_compute[187118]: 2025-11-24 14:36:07.804 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:07 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:07.804 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:36:07 compute-0 nova_compute[187118]: 2025-11-24 14:36:07.864 187122 DEBUG nova.network.neutron [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Successfully updated port: b93ae7b1-0bfb-43d8-8b9e-c0584b9161af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:36:07 compute-0 nova_compute[187118]: 2025-11-24 14:36:07.880 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-3ddbbdc8-3490-42e4-a549-83bfc6add71f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:36:07 compute-0 nova_compute[187118]: 2025-11-24 14:36:07.880 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-3ddbbdc8-3490-42e4-a549-83bfc6add71f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:36:07 compute-0 nova_compute[187118]: 2025-11-24 14:36:07.880 187122 DEBUG nova.network.neutron [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:36:07 compute-0 nova_compute[187118]: 2025-11-24 14:36:07.951 187122 DEBUG nova.compute.manager [req-a81d2e25-3d35-488b-bbd8-ea9692df6913 req-45788121-0c98-4e29-afa5-c78c92c5f935 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Received event network-changed-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:36:07 compute-0 nova_compute[187118]: 2025-11-24 14:36:07.952 187122 DEBUG nova.compute.manager [req-a81d2e25-3d35-488b-bbd8-ea9692df6913 req-45788121-0c98-4e29-afa5-c78c92c5f935 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Refreshing instance network info cache due to event network-changed-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:36:07 compute-0 nova_compute[187118]: 2025-11-24 14:36:07.952 187122 DEBUG oslo_concurrency.lockutils [req-a81d2e25-3d35-488b-bbd8-ea9692df6913 req-45788121-0c98-4e29-afa5-c78c92c5f935 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-3ddbbdc8-3490-42e4-a549-83bfc6add71f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:36:08 compute-0 nova_compute[187118]: 2025-11-24 14:36:08.012 187122 DEBUG nova.network.neutron [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.166 187122 DEBUG nova.network.neutron [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Updating instance_info_cache with network_info: [{"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.184 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-3ddbbdc8-3490-42e4-a549-83bfc6add71f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.185 187122 DEBUG nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Instance network_info: |[{"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.185 187122 DEBUG oslo_concurrency.lockutils [req-a81d2e25-3d35-488b-bbd8-ea9692df6913 req-45788121-0c98-4e29-afa5-c78c92c5f935 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-3ddbbdc8-3490-42e4-a549-83bfc6add71f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.186 187122 DEBUG nova.network.neutron [req-a81d2e25-3d35-488b-bbd8-ea9692df6913 req-45788121-0c98-4e29-afa5-c78c92c5f935 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Refreshing network info cache for port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.189 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Start _get_guest_xml network_info=[{"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.194 187122 WARNING nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.199 187122 DEBUG nova.virt.libvirt.host [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.200 187122 DEBUG nova.virt.libvirt.host [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.206 187122 DEBUG nova.virt.libvirt.host [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.207 187122 DEBUG nova.virt.libvirt.host [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.207 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.208 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.208 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.208 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.209 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.209 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.209 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.210 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.210 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.210 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.210 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.211 187122 DEBUG nova.virt.hardware [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.215 187122 DEBUG nova.virt.libvirt.vif [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:36:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1930886610',display_name='tempest-TestNetworkBasicOps-server-1930886610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1930886610',id=9,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMId+ftUDRTIwWkXbF9KcQfD3bQ44HDk8CIn6Usq0+Jtj3QL7HYWS3ChFv+RnxLkj4HoliaP5H5BqbqAR5nSHXV15QqrlEau84IGejg8Y9WbWeBYR++YX/4q/07UpNWTxA==',key_name='tempest-TestNetworkBasicOps-42275247',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-fnauzlxb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:36:05Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=3ddbbdc8-3490-42e4-a549-83bfc6add71f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.216 187122 DEBUG nova.network.os_vif_util [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.216 187122 DEBUG nova.network.os_vif_util [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.217 187122 DEBUG nova.objects.instance [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3ddbbdc8-3490-42e4-a549-83bfc6add71f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.231 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <uuid>3ddbbdc8-3490-42e4-a549-83bfc6add71f</uuid>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <name>instance-00000009</name>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-1930886610</nova:name>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:36:09</nova:creationTime>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:36:09 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:36:09 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:36:09 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:36:09 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:36:09 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:36:09 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:36:09 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:36:09 compute-0 nova_compute[187118]:         <nova:port uuid="b93ae7b1-0bfb-43d8-8b9e-c0584b9161af">
Nov 24 14:36:09 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <system>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <entry name="serial">3ddbbdc8-3490-42e4-a549-83bfc6add71f</entry>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <entry name="uuid">3ddbbdc8-3490-42e4-a549-83bfc6add71f</entry>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     </system>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <os>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   </os>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <features>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   </features>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk.config"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:36:56:b3"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <target dev="tapb93ae7b1-0b"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/console.log" append="off"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <video>
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     </video>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:36:09 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:36:09 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:36:09 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:36:09 compute-0 nova_compute[187118]: </domain>
Nov 24 14:36:09 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.233 187122 DEBUG nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Preparing to wait for external event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.233 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.234 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.234 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.235 187122 DEBUG nova.virt.libvirt.vif [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:36:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1930886610',display_name='tempest-TestNetworkBasicOps-server-1930886610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1930886610',id=9,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMId+ftUDRTIwWkXbF9KcQfD3bQ44HDk8CIn6Usq0+Jtj3QL7HYWS3ChFv+RnxLkj4HoliaP5H5BqbqAR5nSHXV15QqrlEau84IGejg8Y9WbWeBYR++YX/4q/07UpNWTxA==',key_name='tempest-TestNetworkBasicOps-42275247',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-fnauzlxb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:36:05Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=3ddbbdc8-3490-42e4-a549-83bfc6add71f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.235 187122 DEBUG nova.network.os_vif_util [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.236 187122 DEBUG nova.network.os_vif_util [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.236 187122 DEBUG os_vif [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.237 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.237 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.238 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.241 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.241 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb93ae7b1-0b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.242 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb93ae7b1-0b, col_values=(('external_ids', {'iface-id': 'b93ae7b1-0bfb-43d8-8b9e-c0584b9161af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:56:b3', 'vm-uuid': '3ddbbdc8-3490-42e4-a549-83bfc6add71f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.243 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:09 compute-0 NetworkManager[55697]: <info>  [1763994969.2447] manager: (tapb93ae7b1-0b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.246 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.250 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.252 187122 INFO os_vif [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b')
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.309 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.311 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.312 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:36:56:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.314 187122 INFO nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Using config drive
Nov 24 14:36:09 compute-0 nova_compute[187118]: 2025-11-24 14:36:09.889 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.314 187122 INFO nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Creating config drive at /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk.config
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.318 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppfy85mub execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.456 187122 DEBUG oslo_concurrency.processutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppfy85mub" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:10 compute-0 kernel: tapb93ae7b1-0b: entered promiscuous mode
Nov 24 14:36:10 compute-0 NetworkManager[55697]: <info>  [1763994970.5492] manager: (tapb93ae7b1-0b): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Nov 24 14:36:10 compute-0 ovn_controller[95613]: 2025-11-24T14:36:10Z|00118|binding|INFO|Claiming lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af for this chassis.
Nov 24 14:36:10 compute-0 ovn_controller[95613]: 2025-11-24T14:36:10Z|00119|binding|INFO|b93ae7b1-0bfb-43d8-8b9e-c0584b9161af: Claiming fa:16:3e:36:56:b3 10.100.0.8
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.548 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.555 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:56:b3 10.100.0.8'], port_security=['fa:16:3e:36:56:b3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1455315076', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3ddbbdc8-3490-42e4-a549-83bfc6add71f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1455315076', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '7', 'neutron:security_group_ids': '86594553-2610-4677-ad9a-258b4f3e5a3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d2f7949-6df8-49f4-9577-8ba82e7f7173, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.556 104469 INFO neutron.agent.ovn.metadata.agent [-] Port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af in datapath a0bdebee-864c-45a0-b54c-1e06d962d72e bound to our chassis
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.557 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a0bdebee-864c-45a0-b54c-1e06d962d72e
Nov 24 14:36:10 compute-0 ovn_controller[95613]: 2025-11-24T14:36:10Z|00120|binding|INFO|Setting lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af ovn-installed in OVS
Nov 24 14:36:10 compute-0 ovn_controller[95613]: 2025-11-24T14:36:10Z|00121|binding|INFO|Setting lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af up in Southbound
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.564 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.574 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[0c6a1ec2-9fc7-4a78-8f02-6c39c0af7b3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.576 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa0bdebee-81 in ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.580 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.578 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa0bdebee-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.578 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a39a37dd-f459-4049-a2a7-748598ae3f71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.579 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[be5a73da-cef0-48be-842e-75cdbd245334]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 systemd-udevd[217330]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.593 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[198d5e3b-e53e-4be0-9894-c2ef83edb50e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 systemd-machined[153483]: New machine qemu-9-instance-00000009.
Nov 24 14:36:10 compute-0 NetworkManager[55697]: <info>  [1763994970.6100] device (tapb93ae7b1-0b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:36:10 compute-0 NetworkManager[55697]: <info>  [1763994970.6116] device (tapb93ae7b1-0b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:36:10 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.612 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[34885576-24b9-459b-82ab-8d4af8e918ad]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 podman[217312]: 2025-11-24 14:36:10.631595784 +0000 UTC m=+0.083267265 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.650 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0cd7b6-6771-41ba-93ca-d0e1ca1376eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.655 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[754e4408-5551-4d2e-8386-ad8dddd97eff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 NetworkManager[55697]: <info>  [1763994970.6591] manager: (tapa0bdebee-80): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.691 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[eccf5eb1-0987-48b8-87c8-b64d14e50381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.694 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[3f99aba8-fc2e-4967-8ae2-df9d45a085c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 NetworkManager[55697]: <info>  [1763994970.7169] device (tapa0bdebee-80): carrier: link connected
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.722 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[5b1eb8a1-fcf5-42fb-8efb-c0126c702051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.726 187122 DEBUG nova.compute.manager [req-be899885-31bd-4fbf-b485-fad4a8046d0e req-9f49f264-4f68-4f8f-a26d-88c6560a1014 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Received event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.727 187122 DEBUG oslo_concurrency.lockutils [req-be899885-31bd-4fbf-b485-fad4a8046d0e req-9f49f264-4f68-4f8f-a26d-88c6560a1014 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.727 187122 DEBUG oslo_concurrency.lockutils [req-be899885-31bd-4fbf-b485-fad4a8046d0e req-9f49f264-4f68-4f8f-a26d-88c6560a1014 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.727 187122 DEBUG oslo_concurrency.lockutils [req-be899885-31bd-4fbf-b485-fad4a8046d0e req-9f49f264-4f68-4f8f-a26d-88c6560a1014 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.727 187122 DEBUG nova.compute.manager [req-be899885-31bd-4fbf-b485-fad4a8046d0e req-9f49f264-4f68-4f8f-a26d-88c6560a1014 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Processing event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.742 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a3cc61c5-a1bb-4ef2-9e76-93437dbbe930]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0bdebee-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:65:00'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323814, 'reachable_time': 17512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217379, 'error': None, 'target': 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.757 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc3dad2-49ca-489c-b2ef-d645b92bf331]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee0:6500'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 323814, 'tstamp': 323814}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217380, 'error': None, 'target': 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.774 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2b9d9a-8be2-4a85-80ca-d61dc2d4cb30]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa0bdebee-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e0:65:00'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323814, 'reachable_time': 17512, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217381, 'error': None, 'target': 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.806 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[2d1a0a26-e461-466a-b645-d62ed5952cc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.845 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763994955.843605, f83c0875-7d40-4037-8b77-2fea9c1fd962 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.845 187122 INFO nova.compute.manager [-] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] VM Stopped (Lifecycle Event)
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.861 187122 DEBUG nova.compute.manager [None req-dc7fbc3c-c22e-4792-98e2-9596a9f3d06e - - - - - -] [instance: f83c0875-7d40-4037-8b77-2fea9c1fd962] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.887 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[944bf47f-c2c6-47fb-887f-7740ef5abb58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.889 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0bdebee-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.890 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.890 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0bdebee-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:10 compute-0 NetworkManager[55697]: <info>  [1763994970.8937] manager: (tapa0bdebee-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Nov 24 14:36:10 compute-0 kernel: tapa0bdebee-80: entered promiscuous mode
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.892 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.897 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa0bdebee-80, col_values=(('external_ids', {'iface-id': 'f28333b2-9b09-47b9-87c5-4456a99747de'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:10 compute-0 ovn_controller[95613]: 2025-11-24T14:36:10Z|00122|binding|INFO|Releasing lport f28333b2-9b09-47b9-87c5-4456a99747de from this chassis (sb_readonly=0)
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.899 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.922 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:10 compute-0 nova_compute[187118]: 2025-11-24 14:36:10.924 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.925 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a0bdebee-864c-45a0-b54c-1e06d962d72e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a0bdebee-864c-45a0-b54c-1e06d962d72e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.926 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[5056bd52-cd63-46cb-9e36-a926ed49f35f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.927 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-a0bdebee-864c-45a0-b54c-1e06d962d72e
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/a0bdebee-864c-45a0-b54c-1e06d962d72e.pid.haproxy
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID a0bdebee-864c-45a0-b54c-1e06d962d72e
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:36:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:10.928 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'env', 'PROCESS_TAG=haproxy-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a0bdebee-864c-45a0-b54c-1e06d962d72e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.001 187122 DEBUG nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.002 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994971.0005445, 3ddbbdc8-3490-42e4-a549-83bfc6add71f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.002 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] VM Started (Lifecycle Event)
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.007 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.010 187122 INFO nova.virt.libvirt.driver [-] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Instance spawned successfully.
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.011 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.019 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.027 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.033 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.034 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.034 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.035 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.036 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.036 187122 DEBUG nova.virt.libvirt.driver [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.042 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.043 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994971.000777, 3ddbbdc8-3490-42e4-a549-83bfc6add71f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.043 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] VM Paused (Lifecycle Event)
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.069 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.073 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763994971.0053096, 3ddbbdc8-3490-42e4-a549-83bfc6add71f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.074 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] VM Resumed (Lifecycle Event)
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.094 187122 INFO nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Took 5.76 seconds to spawn the instance on the hypervisor.
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.094 187122 DEBUG nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.096 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.114 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.143 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.164 187122 INFO nova.compute.manager [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Took 6.22 seconds to build instance.
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.178 187122 DEBUG oslo_concurrency.lockutils [None req-6fe87441-d845-4771-b040-b6de9673d9da ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:11 compute-0 podman[217420]: 2025-11-24 14:36:11.374320792 +0000 UTC m=+0.077927277 container create 84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 24 14:36:11 compute-0 podman[217420]: 2025-11-24 14:36:11.33951984 +0000 UTC m=+0.043126295 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:36:11 compute-0 systemd[1]: Started libpod-conmon-84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24.scope.
Nov 24 14:36:11 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:36:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7d4fc800ed7fd907fc5d4daf053293eb537a136456ed6f490ea80ef42855053c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:36:11 compute-0 podman[217420]: 2025-11-24 14:36:11.476283484 +0000 UTC m=+0.179889929 container init 84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118)
Nov 24 14:36:11 compute-0 podman[217420]: 2025-11-24 14:36:11.482063853 +0000 UTC m=+0.185670298 container start 84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:36:11 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217435]: [NOTICE]   (217439) : New worker (217441) forked
Nov 24 14:36:11 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217435]: [NOTICE]   (217439) : Loading success.
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.555 187122 DEBUG nova.network.neutron [req-a81d2e25-3d35-488b-bbd8-ea9692df6913 req-45788121-0c98-4e29-afa5-c78c92c5f935 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Updated VIF entry in instance network info cache for port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.556 187122 DEBUG nova.network.neutron [req-a81d2e25-3d35-488b-bbd8-ea9692df6913 req-45788121-0c98-4e29-afa5-c78c92c5f935 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Updating instance_info_cache with network_info: [{"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:36:11 compute-0 nova_compute[187118]: 2025-11-24 14:36:11.577 187122 DEBUG oslo_concurrency.lockutils [req-a81d2e25-3d35-488b-bbd8-ea9692df6913 req-45788121-0c98-4e29-afa5-c78c92c5f935 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-3ddbbdc8-3490-42e4-a549-83bfc6add71f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:36:12 compute-0 nova_compute[187118]: 2025-11-24 14:36:12.796 187122 DEBUG nova.compute.manager [req-a1cc3ab3-4d2a-4ca1-9f9e-3c6b7ae4fca6 req-880dac63-ac09-4249-8423-b214225c45b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Received event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:36:12 compute-0 nova_compute[187118]: 2025-11-24 14:36:12.796 187122 DEBUG oslo_concurrency.lockutils [req-a1cc3ab3-4d2a-4ca1-9f9e-3c6b7ae4fca6 req-880dac63-ac09-4249-8423-b214225c45b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:12 compute-0 nova_compute[187118]: 2025-11-24 14:36:12.796 187122 DEBUG oslo_concurrency.lockutils [req-a1cc3ab3-4d2a-4ca1-9f9e-3c6b7ae4fca6 req-880dac63-ac09-4249-8423-b214225c45b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:12 compute-0 nova_compute[187118]: 2025-11-24 14:36:12.796 187122 DEBUG oslo_concurrency.lockutils [req-a1cc3ab3-4d2a-4ca1-9f9e-3c6b7ae4fca6 req-880dac63-ac09-4249-8423-b214225c45b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:12 compute-0 nova_compute[187118]: 2025-11-24 14:36:12.797 187122 DEBUG nova.compute.manager [req-a1cc3ab3-4d2a-4ca1-9f9e-3c6b7ae4fca6 req-880dac63-ac09-4249-8423-b214225c45b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] No waiting events found dispatching network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:36:12 compute-0 nova_compute[187118]: 2025-11-24 14:36:12.797 187122 WARNING nova.compute.manager [req-a1cc3ab3-4d2a-4ca1-9f9e-3c6b7ae4fca6 req-880dac63-ac09-4249-8423-b214225c45b7 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Received unexpected event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af for instance with vm_state active and task_state None.
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.246 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.755 187122 DEBUG oslo_concurrency.lockutils [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.756 187122 DEBUG oslo_concurrency.lockutils [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.756 187122 DEBUG oslo_concurrency.lockutils [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.757 187122 DEBUG oslo_concurrency.lockutils [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.757 187122 DEBUG oslo_concurrency.lockutils [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.758 187122 INFO nova.compute.manager [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Terminating instance
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.759 187122 DEBUG nova.compute.manager [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:36:14 compute-0 kernel: tapb93ae7b1-0b (unregistering): left promiscuous mode
Nov 24 14:36:14 compute-0 NetworkManager[55697]: <info>  [1763994974.7799] device (tapb93ae7b1-0b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:36:14 compute-0 ovn_controller[95613]: 2025-11-24T14:36:14Z|00123|binding|INFO|Releasing lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af from this chassis (sb_readonly=0)
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.794 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:14 compute-0 ovn_controller[95613]: 2025-11-24T14:36:14Z|00124|binding|INFO|Setting lport b93ae7b1-0bfb-43d8-8b9e-c0584b9161af down in Southbound
Nov 24 14:36:14 compute-0 ovn_controller[95613]: 2025-11-24T14:36:14Z|00125|binding|INFO|Removing iface tapb93ae7b1-0b ovn-installed in OVS
Nov 24 14:36:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:14.802 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:56:b3 10.100.0.8'], port_security=['fa:16:3e:36:56:b3 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1455315076', 'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '3ddbbdc8-3490-42e4-a549-83bfc6add71f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1455315076', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '9', 'neutron:security_group_ids': '86594553-2610-4677-ad9a-258b4f3e5a3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.187', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d2f7949-6df8-49f4-9577-8ba82e7f7173, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:36:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:14.805 104469 INFO neutron.agent.ovn.metadata.agent [-] Port b93ae7b1-0bfb-43d8-8b9e-c0584b9161af in datapath a0bdebee-864c-45a0-b54c-1e06d962d72e unbound from our chassis
Nov 24 14:36:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:14.807 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a0bdebee-864c-45a0-b54c-1e06d962d72e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:36:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:14.809 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:14.809 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[b2211b00-6adc-44aa-834f-dd3ac0383f7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:14 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:14.812 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e namespace which is not needed anymore
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.825 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:14 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Nov 24 14:36:14 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 4.162s CPU time.
Nov 24 14:36:14 compute-0 systemd-machined[153483]: Machine qemu-9-instance-00000009 terminated.
Nov 24 14:36:14 compute-0 nova_compute[187118]: 2025-11-24 14:36:14.891 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:14 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217435]: [NOTICE]   (217439) : haproxy version is 2.8.14-c23fe91
Nov 24 14:36:14 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217435]: [NOTICE]   (217439) : path to executable is /usr/sbin/haproxy
Nov 24 14:36:14 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217435]: [WARNING]  (217439) : Exiting Master process...
Nov 24 14:36:14 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217435]: [WARNING]  (217439) : Exiting Master process...
Nov 24 14:36:14 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217435]: [ALERT]    (217439) : Current worker (217441) exited with code 143 (Terminated)
Nov 24 14:36:14 compute-0 neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e[217435]: [WARNING]  (217439) : All workers exited. Exiting... (0)
Nov 24 14:36:14 compute-0 systemd[1]: libpod-84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24.scope: Deactivated successfully.
Nov 24 14:36:14 compute-0 podman[217474]: 2025-11-24 14:36:14.942550933 +0000 UTC m=+0.042674262 container died 84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_managed=true)
Nov 24 14:36:14 compute-0 systemd[1]: var-lib-containers-storage-overlay-7d4fc800ed7fd907fc5d4daf053293eb537a136456ed6f490ea80ef42855053c-merged.mount: Deactivated successfully.
Nov 24 14:36:14 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24-userdata-shm.mount: Deactivated successfully.
Nov 24 14:36:14 compute-0 podman[217474]: 2025-11-24 14:36:14.981572042 +0000 UTC m=+0.081695341 container cleanup 84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 24 14:36:14 compute-0 systemd[1]: libpod-conmon-84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24.scope: Deactivated successfully.
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.019 187122 INFO nova.virt.libvirt.driver [-] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Instance destroyed successfully.
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.020 187122 DEBUG nova.objects.instance [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid 3ddbbdc8-3490-42e4-a549-83bfc6add71f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.028 187122 DEBUG nova.virt.libvirt.vif [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:36:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1930886610',display_name='tempest-TestNetworkBasicOps-server-1930886610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1930886610',id=9,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMId+ftUDRTIwWkXbF9KcQfD3bQ44HDk8CIn6Usq0+Jtj3QL7HYWS3ChFv+RnxLkj4HoliaP5H5BqbqAR5nSHXV15QqrlEau84IGejg8Y9WbWeBYR++YX/4q/07UpNWTxA==',key_name='tempest-TestNetworkBasicOps-42275247',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:36:11Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-fnauzlxb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:36:11Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=3ddbbdc8-3490-42e4-a549-83bfc6add71f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.029 187122 DEBUG nova.network.os_vif_util [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "address": "fa:16:3e:36:56:b3", "network": {"id": "a0bdebee-864c-45a0-b54c-1e06d962d72e", "bridge": "br-int", "label": "tempest-network-smoke--1420093423", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb93ae7b1-0b", "ovs_interfaceid": "b93ae7b1-0bfb-43d8-8b9e-c0584b9161af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.029 187122 DEBUG nova.network.os_vif_util [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.030 187122 DEBUG os_vif [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.031 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.031 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb93ae7b1-0b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.032 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.033 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.035 187122 INFO os_vif [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:56:b3,bridge_name='br-int',has_traffic_filtering=True,id=b93ae7b1-0bfb-43d8-8b9e-c0584b9161af,network=Network(a0bdebee-864c-45a0-b54c-1e06d962d72e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapb93ae7b1-0b')
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.036 187122 INFO nova.virt.libvirt.driver [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Deleting instance files /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f_del
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.036 187122 INFO nova.virt.libvirt.driver [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Deletion of /var/lib/nova/instances/3ddbbdc8-3490-42e4-a549-83bfc6add71f_del complete
Nov 24 14:36:15 compute-0 podman[217509]: 2025-11-24 14:36:15.055698173 +0000 UTC m=+0.051366442 container remove 84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:36:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:15.060 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a36aa9ce-38cb-400e-89ac-8237aaa8b75a]: (4, ('Mon Nov 24 02:36:14 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e (84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24)\n84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24\nMon Nov 24 02:36:14 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e (84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24)\n84468bbc159779e31a8cf5cd236aa99b4e8acc69f0de52c77c37f9b1d7228a24\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:15.061 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[26978a63-afc3-4c42-96d9-14afa2edc34b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:15.062 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0bdebee-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:15 compute-0 kernel: tapa0bdebee-80: left promiscuous mode
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.064 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.075 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:15.076 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ab7c70c4-4edd-460d-aba8-908517aa7f84]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.078 187122 INFO nova.compute.manager [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Took 0.32 seconds to destroy the instance on the hypervisor.
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.079 187122 DEBUG oslo.service.loopingcall [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.079 187122 DEBUG nova.compute.manager [-] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.079 187122 DEBUG nova.network.neutron [-] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:36:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:15.092 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9ef17b-36f4-4512-b609-93aea55a3c60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:15.093 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a419d200-f4ff-4c03-a805-ec8f5daa534f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:15.107 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ea02b1a6-d7e5-4eab-822d-06fd6cdf7390]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323807, 'reachable_time': 22228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217534, 'error': None, 'target': 'ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:15.109 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a0bdebee-864c-45a0-b54c-1e06d962d72e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:36:15 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:15.109 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[48e0dcec-bb25-44b1-b918-4698361a17cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:15 compute-0 systemd[1]: run-netns-ovnmeta\x2da0bdebee\x2d864c\x2d45a0\x2db54c\x2d1e06d962d72e.mount: Deactivated successfully.
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.434 187122 DEBUG nova.compute.manager [req-315f4a3a-b4c8-4848-889f-3a807756741c req-157ab62a-fb4e-46ed-96ef-54549ed9fae2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Received event network-vif-unplugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.434 187122 DEBUG oslo_concurrency.lockutils [req-315f4a3a-b4c8-4848-889f-3a807756741c req-157ab62a-fb4e-46ed-96ef-54549ed9fae2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.434 187122 DEBUG oslo_concurrency.lockutils [req-315f4a3a-b4c8-4848-889f-3a807756741c req-157ab62a-fb4e-46ed-96ef-54549ed9fae2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.435 187122 DEBUG oslo_concurrency.lockutils [req-315f4a3a-b4c8-4848-889f-3a807756741c req-157ab62a-fb4e-46ed-96ef-54549ed9fae2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.435 187122 DEBUG nova.compute.manager [req-315f4a3a-b4c8-4848-889f-3a807756741c req-157ab62a-fb4e-46ed-96ef-54549ed9fae2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] No waiting events found dispatching network-vif-unplugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:36:15 compute-0 nova_compute[187118]: 2025-11-24 14:36:15.435 187122 DEBUG nova.compute.manager [req-315f4a3a-b4c8-4848-889f-3a807756741c req-157ab62a-fb4e-46ed-96ef-54549ed9fae2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Received event network-vif-unplugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:36:16 compute-0 nova_compute[187118]: 2025-11-24 14:36:16.119 187122 DEBUG nova.network.neutron [-] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:36:16 compute-0 nova_compute[187118]: 2025-11-24 14:36:16.132 187122 INFO nova.compute.manager [-] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Took 1.05 seconds to deallocate network for instance.
Nov 24 14:36:16 compute-0 nova_compute[187118]: 2025-11-24 14:36:16.176 187122 DEBUG oslo_concurrency.lockutils [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:16 compute-0 nova_compute[187118]: 2025-11-24 14:36:16.177 187122 DEBUG oslo_concurrency.lockutils [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:16 compute-0 nova_compute[187118]: 2025-11-24 14:36:16.244 187122 DEBUG nova.compute.provider_tree [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:36:16 compute-0 nova_compute[187118]: 2025-11-24 14:36:16.258 187122 DEBUG nova.scheduler.client.report [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:36:16 compute-0 nova_compute[187118]: 2025-11-24 14:36:16.286 187122 DEBUG oslo_concurrency.lockutils [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:16 compute-0 nova_compute[187118]: 2025-11-24 14:36:16.316 187122 INFO nova.scheduler.client.report [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance 3ddbbdc8-3490-42e4-a549-83bfc6add71f
Nov 24 14:36:16 compute-0 nova_compute[187118]: 2025-11-24 14:36:16.375 187122 DEBUG oslo_concurrency.lockutils [None req-d8fa17c0-802b-4f46-a4a6-2b53a92e14d1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:17 compute-0 podman[217535]: 2025-11-24 14:36:17.447335821 +0000 UTC m=+0.053831219 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 24 14:36:17 compute-0 nova_compute[187118]: 2025-11-24 14:36:17.552 187122 DEBUG nova.compute.manager [req-7e0592dd-d183-4114-abac-88c6ad6de489 req-dddc7340-edf8-4103-be1a-76b58cd4d7a9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Received event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:36:17 compute-0 nova_compute[187118]: 2025-11-24 14:36:17.552 187122 DEBUG oslo_concurrency.lockutils [req-7e0592dd-d183-4114-abac-88c6ad6de489 req-dddc7340-edf8-4103-be1a-76b58cd4d7a9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:17 compute-0 nova_compute[187118]: 2025-11-24 14:36:17.552 187122 DEBUG oslo_concurrency.lockutils [req-7e0592dd-d183-4114-abac-88c6ad6de489 req-dddc7340-edf8-4103-be1a-76b58cd4d7a9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:17 compute-0 nova_compute[187118]: 2025-11-24 14:36:17.553 187122 DEBUG oslo_concurrency.lockutils [req-7e0592dd-d183-4114-abac-88c6ad6de489 req-dddc7340-edf8-4103-be1a-76b58cd4d7a9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "3ddbbdc8-3490-42e4-a549-83bfc6add71f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:17 compute-0 nova_compute[187118]: 2025-11-24 14:36:17.553 187122 DEBUG nova.compute.manager [req-7e0592dd-d183-4114-abac-88c6ad6de489 req-dddc7340-edf8-4103-be1a-76b58cd4d7a9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] No waiting events found dispatching network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:36:17 compute-0 nova_compute[187118]: 2025-11-24 14:36:17.553 187122 WARNING nova.compute.manager [req-7e0592dd-d183-4114-abac-88c6ad6de489 req-dddc7340-edf8-4103-be1a-76b58cd4d7a9 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Received unexpected event network-vif-plugged-b93ae7b1-0bfb-43d8-8b9e-c0584b9161af for instance with vm_state deleted and task_state None.
Nov 24 14:36:19 compute-0 nova_compute[187118]: 2025-11-24 14:36:19.893 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:20 compute-0 nova_compute[187118]: 2025-11-24 14:36:20.032 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:20 compute-0 nova_compute[187118]: 2025-11-24 14:36:20.433 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:20 compute-0 nova_compute[187118]: 2025-11-24 14:36:20.539 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:21 compute-0 podman[217557]: 2025-11-24 14:36:21.467592097 +0000 UTC m=+0.071486818 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 14:36:21 compute-0 podman[217556]: 2025-11-24 14:36:21.48535981 +0000 UTC m=+0.085080446 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 24 14:36:24 compute-0 nova_compute[187118]: 2025-11-24 14:36:24.904 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:25 compute-0 nova_compute[187118]: 2025-11-24 14:36:25.037 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:26 compute-0 podman[217599]: 2025-11-24 14:36:26.531584981 +0000 UTC m=+0.120835624 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is 
a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 24 14:36:26 compute-0 podman[217598]: 2025-11-24 14:36:26.551720207 +0000 UTC m=+0.158875696 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 24 14:36:29 compute-0 nova_compute[187118]: 2025-11-24 14:36:29.905 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:30 compute-0 nova_compute[187118]: 2025-11-24 14:36:30.017 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763994975.0159976, 3ddbbdc8-3490-42e4-a549-83bfc6add71f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:36:30 compute-0 nova_compute[187118]: 2025-11-24 14:36:30.017 187122 INFO nova.compute.manager [-] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] VM Stopped (Lifecycle Event)
Nov 24 14:36:30 compute-0 nova_compute[187118]: 2025-11-24 14:36:30.037 187122 DEBUG nova.compute.manager [None req-e2dc5dd7-af66-4a7c-835f-fd6e5aaa1810 - - - - - -] [instance: 3ddbbdc8-3490-42e4-a549-83bfc6add71f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:30 compute-0 nova_compute[187118]: 2025-11-24 14:36:30.039 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:33 compute-0 podman[217642]: 2025-11-24 14:36:33.443538149 +0000 UTC m=+0.052614117 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:36:34 compute-0 nova_compute[187118]: 2025-11-24 14:36:34.911 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:35 compute-0 nova_compute[187118]: 2025-11-24 14:36:35.041 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.657 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.658 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.672 187122 DEBUG nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.741 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.741 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.749 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.750 187122 INFO nova.compute.claims [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.832 187122 DEBUG nova.compute.provider_tree [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.842 187122 DEBUG nova.scheduler.client.report [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.856 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.856 187122 DEBUG nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.888 187122 DEBUG nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.889 187122 DEBUG nova.network.neutron [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.905 187122 INFO nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.919 187122 DEBUG nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.989 187122 DEBUG nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.990 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.990 187122 INFO nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Creating image(s)
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.991 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.991 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:36 compute-0 nova_compute[187118]: 2025-11-24 14:36:36.992 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.003 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.054 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.056 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.056 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.067 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.118 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.120 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.159 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.160 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.161 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.232 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.234 187122 DEBUG nova.virt.disk.api [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.235 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.324 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.326 187122 DEBUG nova.virt.disk.api [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.326 187122 DEBUG nova.objects.instance [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid 9c8d60c6-3fd4-44f5-bb06-16da6c642889 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.333 187122 DEBUG nova.policy [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.357 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.358 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Ensure instance console log exists: /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.359 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.360 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.360 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:37 compute-0 nova_compute[187118]: 2025-11-24 14:36:37.942 187122 DEBUG nova.network.neutron [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Successfully created port: 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:36:38 compute-0 nova_compute[187118]: 2025-11-24 14:36:38.791 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:38 compute-0 nova_compute[187118]: 2025-11-24 14:36:38.850 187122 DEBUG nova.network.neutron [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Successfully updated port: 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:36:38 compute-0 nova_compute[187118]: 2025-11-24 14:36:38.869 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:36:38 compute-0 nova_compute[187118]: 2025-11-24 14:36:38.869 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:36:38 compute-0 nova_compute[187118]: 2025-11-24 14:36:38.869 187122 DEBUG nova.network.neutron [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:36:38 compute-0 nova_compute[187118]: 2025-11-24 14:36:38.953 187122 DEBUG nova.compute.manager [req-028a0e42-8b55-48de-8ace-43232c0384d6 req-e6990713-fb15-4109-a8d5-93f60d76f658 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received event network-changed-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:36:38 compute-0 nova_compute[187118]: 2025-11-24 14:36:38.953 187122 DEBUG nova.compute.manager [req-028a0e42-8b55-48de-8ace-43232c0384d6 req-e6990713-fb15-4109-a8d5-93f60d76f658 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Refreshing instance network info cache due to event network-changed-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:36:38 compute-0 nova_compute[187118]: 2025-11-24 14:36:38.954 187122 DEBUG oslo_concurrency.lockutils [req-028a0e42-8b55-48de-8ace-43232c0384d6 req-e6990713-fb15-4109-a8d5-93f60d76f658 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:36:39 compute-0 nova_compute[187118]: 2025-11-24 14:36:39.023 187122 DEBUG nova.network.neutron [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:36:39 compute-0 nova_compute[187118]: 2025-11-24 14:36:39.914 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.040 187122 DEBUG nova.network.neutron [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updating instance_info_cache with network_info: [{"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.043 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.059 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.059 187122 DEBUG nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Instance network_info: |[{"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.059 187122 DEBUG oslo_concurrency.lockutils [req-028a0e42-8b55-48de-8ace-43232c0384d6 req-e6990713-fb15-4109-a8d5-93f60d76f658 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.060 187122 DEBUG nova.network.neutron [req-028a0e42-8b55-48de-8ace-43232c0384d6 req-e6990713-fb15-4109-a8d5-93f60d76f658 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Refreshing network info cache for port 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.062 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Start _get_guest_xml network_info=[{"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.068 187122 WARNING nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.072 187122 DEBUG nova.virt.libvirt.host [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.072 187122 DEBUG nova.virt.libvirt.host [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.078 187122 DEBUG nova.virt.libvirt.host [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.078 187122 DEBUG nova.virt.libvirt.host [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.078 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.079 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.079 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.079 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.080 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.080 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.080 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.080 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.080 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.081 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.081 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.081 187122 DEBUG nova.virt.hardware [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.084 187122 DEBUG nova.virt.libvirt.vif [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:36:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2095012351',display_name='tempest-TestNetworkBasicOps-server-2095012351',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2095012351',id=10,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5hfbIEchPMF2ULpdg1raFpV7onF6hG/g7OdGYdHdMpPZtInGWK1Y8d1+9yJeeQFsR/dyF5wucQsgS2RZTgTXRlKyqp5FugKEeWtC4NVoeCSbYWD2ntqsY7ovakHvDWLg==',key_name='tempest-TestNetworkBasicOps-1102787144',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-y2s2r3gc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:36:36Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=9c8d60c6-3fd4-44f5-bb06-16da6c642889,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.085 187122 DEBUG nova.network.os_vif_util [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.085 187122 DEBUG nova.network.os_vif_util [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:64:93,bridge_name='br-int',has_traffic_filtering=True,id=6ac8169b-b76a-43c1-8baa-2e6aa1db7a50,network=Network(efaca8b4-60bb-4ba9-b254-02fcfdeb3298),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac8169b-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.086 187122 DEBUG nova.objects.instance [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9c8d60c6-3fd4-44f5-bb06-16da6c642889 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.097 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <uuid>9c8d60c6-3fd4-44f5-bb06-16da6c642889</uuid>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <name>instance-0000000a</name>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-2095012351</nova:name>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:36:40</nova:creationTime>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:36:40 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:36:40 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:36:40 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:36:40 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:36:40 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:36:40 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:36:40 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:36:40 compute-0 nova_compute[187118]:         <nova:port uuid="6ac8169b-b76a-43c1-8baa-2e6aa1db7a50">
Nov 24 14:36:40 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <system>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <entry name="serial">9c8d60c6-3fd4-44f5-bb06-16da6c642889</entry>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <entry name="uuid">9c8d60c6-3fd4-44f5-bb06-16da6c642889</entry>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     </system>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <os>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   </os>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <features>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   </features>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk.config"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:a3:64:93"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <target dev="tap6ac8169b-b7"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/console.log" append="off"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <video>
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     </video>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:36:40 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:36:40 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:36:40 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:36:40 compute-0 nova_compute[187118]: </domain>
Nov 24 14:36:40 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.098 187122 DEBUG nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Preparing to wait for external event network-vif-plugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.098 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.099 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.099 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.099 187122 DEBUG nova.virt.libvirt.vif [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:36:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2095012351',display_name='tempest-TestNetworkBasicOps-server-2095012351',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2095012351',id=10,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5hfbIEchPMF2ULpdg1raFpV7onF6hG/g7OdGYdHdMpPZtInGWK1Y8d1+9yJeeQFsR/dyF5wucQsgS2RZTgTXRlKyqp5FugKEeWtC4NVoeCSbYWD2ntqsY7ovakHvDWLg==',key_name='tempest-TestNetworkBasicOps-1102787144',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-y2s2r3gc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:36:36Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=9c8d60c6-3fd4-44f5-bb06-16da6c642889,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.100 187122 DEBUG nova.network.os_vif_util [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.100 187122 DEBUG nova.network.os_vif_util [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:64:93,bridge_name='br-int',has_traffic_filtering=True,id=6ac8169b-b76a-43c1-8baa-2e6aa1db7a50,network=Network(efaca8b4-60bb-4ba9-b254-02fcfdeb3298),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac8169b-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.100 187122 DEBUG os_vif [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:64:93,bridge_name='br-int',has_traffic_filtering=True,id=6ac8169b-b76a-43c1-8baa-2e6aa1db7a50,network=Network(efaca8b4-60bb-4ba9-b254-02fcfdeb3298),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac8169b-b7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.101 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.101 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.101 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.103 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.103 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ac8169b-b7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.104 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6ac8169b-b7, col_values=(('external_ids', {'iface-id': '6ac8169b-b76a-43c1-8baa-2e6aa1db7a50', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:64:93', 'vm-uuid': '9c8d60c6-3fd4-44f5-bb06-16da6c642889'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.105 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 NetworkManager[55697]: <info>  [1763995000.1059] manager: (tap6ac8169b-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.107 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.112 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.112 187122 INFO os_vif [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:64:93,bridge_name='br-int',has_traffic_filtering=True,id=6ac8169b-b76a-43c1-8baa-2e6aa1db7a50,network=Network(efaca8b4-60bb-4ba9-b254-02fcfdeb3298),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac8169b-b7')
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.166 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.167 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.167 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:a3:64:93, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.167 187122 INFO nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Using config drive
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.513 187122 INFO nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Creating config drive at /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk.config
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.524 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppdpbst1m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.670 187122 DEBUG oslo_concurrency.processutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppdpbst1m" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:40 compute-0 kernel: tap6ac8169b-b7: entered promiscuous mode
Nov 24 14:36:40 compute-0 NetworkManager[55697]: <info>  [1763995000.7960] manager: (tap6ac8169b-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Nov 24 14:36:40 compute-0 ovn_controller[95613]: 2025-11-24T14:36:40Z|00126|binding|INFO|Claiming lport 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 for this chassis.
Nov 24 14:36:40 compute-0 ovn_controller[95613]: 2025-11-24T14:36:40Z|00127|binding|INFO|6ac8169b-b76a-43c1-8baa-2e6aa1db7a50: Claiming fa:16:3e:a3:64:93 10.100.0.5
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.798 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.799 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.806 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 systemd-udevd[217709]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.838 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.839 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.839 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.839 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.838 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:64:93 10.100.0.5'], port_security=['fa:16:3e:a3:64:93 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9c8d60c6-3fd4-44f5-bb06-16da6c642889', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efaca8b4-60bb-4ba9-b254-02fcfdeb3298', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '073514a9-c9b8-47b9-a740-4f6ccfdf4a5e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bace45b9-4501-4b0e-af9e-56715830300b, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=6ac8169b-b76a-43c1-8baa-2e6aa1db7a50) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.839 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 in datapath efaca8b4-60bb-4ba9-b254-02fcfdeb3298 bound to our chassis
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.840 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network efaca8b4-60bb-4ba9-b254-02fcfdeb3298
Nov 24 14:36:40 compute-0 NetworkManager[55697]: <info>  [1763995000.8491] device (tap6ac8169b-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:36:40 compute-0 NetworkManager[55697]: <info>  [1763995000.8503] device (tap6ac8169b-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.855 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[94b875d6-9df0-40ff-98b5-0f5c6ae04d64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.855 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapefaca8b4-61 in ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.857 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapefaca8b4-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.857 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7c618fbe-305a-40ce-bc5b-9197e78ed32a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.858 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[c7295d19-69b8-404a-9c9e-e3325daf1fc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.873 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[08b5e661-c43d-452b-8217-f18393179fe5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:40 compute-0 podman[217695]: 2025-11-24 14:36:40.883959169 +0000 UTC m=+0.091614085 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.899 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 systemd-machined[153483]: New machine qemu-10-instance-0000000a.
Nov 24 14:36:40 compute-0 ovn_controller[95613]: 2025-11-24T14:36:40Z|00128|binding|INFO|Setting lport 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 ovn-installed in OVS
Nov 24 14:36:40 compute-0 ovn_controller[95613]: 2025-11-24T14:36:40Z|00129|binding|INFO|Setting lport 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 up in Southbound
Nov 24 14:36:40 compute-0 nova_compute[187118]: 2025-11-24 14:36:40.904 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.908 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[452afc5d-d89e-40fe-b070-88469edbb9a0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:40 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.947 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[049f19fa-ecde-4873-b802-a7d2c052e3ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.954 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[86d75569-3cad-4f96-982a-4d57bdb57265]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:40 compute-0 NetworkManager[55697]: <info>  [1763995000.9557] manager: (tapefaca8b4-60): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.990 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[4729d14b-dcb3-492a-8b45-bf4557acc742]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:40.994 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[2b63823f-31da-4c98-b1dc-10a0de7c58fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.014 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:41 compute-0 NetworkManager[55697]: <info>  [1763995001.0279] device (tapefaca8b4-60): carrier: link connected
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.035 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[eb38886f-c3dc-4b31-ab2b-5e7ddf0d2735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.053 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[da420f0e-322b-4c4c-891d-a265dd56efb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapefaca8b4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:86:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326845, 'reachable_time': 40142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217757, 'error': None, 'target': 'ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.074 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[96cbd47a-3333-497f-975a-dc011bee06a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe43:8605'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 326845, 'tstamp': 326845}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217758, 'error': None, 'target': 'ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.102 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.103 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.103 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[804adca1-6595-45e7-8e3f-603952d6290b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapefaca8b4-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:43:86:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326845, 'reachable_time': 40142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217759, 'error': None, 'target': 'ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.127 187122 DEBUG nova.compute.manager [req-6840262a-c3c6-4645-816b-4d46631ec34c req-4b6e4cae-9ad1-4a9f-8ed8-7fc290e17fb2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received event network-vif-plugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.128 187122 DEBUG oslo_concurrency.lockutils [req-6840262a-c3c6-4645-816b-4d46631ec34c req-4b6e4cae-9ad1-4a9f-8ed8-7fc290e17fb2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.129 187122 DEBUG oslo_concurrency.lockutils [req-6840262a-c3c6-4645-816b-4d46631ec34c req-4b6e4cae-9ad1-4a9f-8ed8-7fc290e17fb2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.129 187122 DEBUG oslo_concurrency.lockutils [req-6840262a-c3c6-4645-816b-4d46631ec34c req-4b6e4cae-9ad1-4a9f-8ed8-7fc290e17fb2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.129 187122 DEBUG nova.compute.manager [req-6840262a-c3c6-4645-816b-4d46631ec34c req-4b6e4cae-9ad1-4a9f-8ed8-7fc290e17fb2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Processing event network-vif-plugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.156 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[67b436eb-c006-4ccd-bde2-02230dd04096]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.173 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.270 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a67868cb-49c0-419d-b9a0-f2d41c6e73a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.271 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefaca8b4-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.271 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.272 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefaca8b4-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.273 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:41 compute-0 NetworkManager[55697]: <info>  [1763995001.2746] manager: (tapefaca8b4-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Nov 24 14:36:41 compute-0 kernel: tapefaca8b4-60: entered promiscuous mode
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.278 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.279 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapefaca8b4-60, col_values=(('external_ids', {'iface-id': 'dc553ba5-d5f1-490c-8e4b-b7f3ef2ea42c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.280 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:41 compute-0 ovn_controller[95613]: 2025-11-24T14:36:41Z|00130|binding|INFO|Releasing lport dc553ba5-d5f1-490c-8e4b-b7f3ef2ea42c from this chassis (sb_readonly=0)
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.304 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.306 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/efaca8b4-60bb-4ba9-b254-02fcfdeb3298.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/efaca8b4-60bb-4ba9-b254-02fcfdeb3298.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.307 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[d341e68d-7c64-402f-8efe-d63501f20132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.308 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-efaca8b4-60bb-4ba9-b254-02fcfdeb3298
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/efaca8b4-60bb-4ba9-b254-02fcfdeb3298.pid.haproxy
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID efaca8b4-60bb-4ba9-b254-02fcfdeb3298
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:36:41 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:41.308 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298', 'env', 'PROCESS_TAG=haproxy-efaca8b4-60bb-4ba9-b254-02fcfdeb3298', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/efaca8b4-60bb-4ba9-b254-02fcfdeb3298.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.366 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.367 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5754MB free_disk=73.45812225341797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.367 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.367 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.446 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance 9c8d60c6-3fd4-44f5-bb06-16da6c642889 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.446 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.447 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.477 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995001.4767447, 9c8d60c6-3fd4-44f5-bb06-16da6c642889 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.477 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] VM Started (Lifecycle Event)
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.478 187122 DEBUG nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.483 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.488 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.490 187122 INFO nova.virt.libvirt.driver [-] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Instance spawned successfully.
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.490 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.494 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.497 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.515 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.515 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.516 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.516 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.517 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.517 187122 DEBUG nova.virt.libvirt.driver [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.521 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.523 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.523 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995001.477324, 9c8d60c6-3fd4-44f5-bb06-16da6c642889 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.523 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] VM Paused (Lifecycle Event)
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.554 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.554 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.572 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.581 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995001.4813666, 9c8d60c6-3fd4-44f5-bb06-16da6c642889 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.581 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] VM Resumed (Lifecycle Event)
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.585 187122 INFO nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Took 4.60 seconds to spawn the instance on the hypervisor.
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.585 187122 DEBUG nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.596 187122 DEBUG nova.network.neutron [req-028a0e42-8b55-48de-8ace-43232c0384d6 req-e6990713-fb15-4109-a8d5-93f60d76f658 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updated VIF entry in instance network info cache for port 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.597 187122 DEBUG nova.network.neutron [req-028a0e42-8b55-48de-8ace-43232c0384d6 req-e6990713-fb15-4109-a8d5-93f60d76f658 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updating instance_info_cache with network_info: [{"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.611 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.617 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.618 187122 DEBUG oslo_concurrency.lockutils [req-028a0e42-8b55-48de-8ace-43232c0384d6 req-e6990713-fb15-4109-a8d5-93f60d76f658 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.649 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.664 187122 INFO nova.compute.manager [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Took 4.95 seconds to build instance.
Nov 24 14:36:41 compute-0 nova_compute[187118]: 2025-11-24 14:36:41.681 187122 DEBUG oslo_concurrency.lockutils [None req-4fa90237-15a0-4481-9070-72d2da1dfc63 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.023s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:41 compute-0 podman[217803]: 2025-11-24 14:36:41.799265162 +0000 UTC m=+0.071848919 container create 274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0)
Nov 24 14:36:41 compute-0 systemd[1]: Started libpod-conmon-274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1.scope.
Nov 24 14:36:41 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:36:41 compute-0 podman[217803]: 2025-11-24 14:36:41.775145994 +0000 UTC m=+0.047729771 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:36:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bdc3aff5f5c5be7b07bc814d25abd71057cb2b9b755613ac6973655460a6ac5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:36:41 compute-0 podman[217803]: 2025-11-24 14:36:41.886789313 +0000 UTC m=+0.159373090 container init 274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 14:36:41 compute-0 podman[217803]: 2025-11-24 14:36:41.892057689 +0000 UTC m=+0.164641446 container start 274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:36:41 compute-0 neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298[217818]: [NOTICE]   (217822) : New worker (217824) forked
Nov 24 14:36:41 compute-0 neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298[217818]: [NOTICE]   (217822) : Loading success.
Nov 24 14:36:42 compute-0 nova_compute[187118]: 2025-11-24 14:36:42.553 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:42 compute-0 nova_compute[187118]: 2025-11-24 14:36:42.554 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:36:42 compute-0 nova_compute[187118]: 2025-11-24 14:36:42.554 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.198 187122 DEBUG nova.compute.manager [req-8f48e0cb-93b4-4c07-82ae-481dbdc79790 req-6b92cf5d-bd4c-46f6-a3d8-0b546b413e7c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received event network-vif-plugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.199 187122 DEBUG oslo_concurrency.lockutils [req-8f48e0cb-93b4-4c07-82ae-481dbdc79790 req-6b92cf5d-bd4c-46f6-a3d8-0b546b413e7c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.199 187122 DEBUG oslo_concurrency.lockutils [req-8f48e0cb-93b4-4c07-82ae-481dbdc79790 req-6b92cf5d-bd4c-46f6-a3d8-0b546b413e7c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.200 187122 DEBUG oslo_concurrency.lockutils [req-8f48e0cb-93b4-4c07-82ae-481dbdc79790 req-6b92cf5d-bd4c-46f6-a3d8-0b546b413e7c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.200 187122 DEBUG nova.compute.manager [req-8f48e0cb-93b4-4c07-82ae-481dbdc79790 req-6b92cf5d-bd4c-46f6-a3d8-0b546b413e7c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] No waiting events found dispatching network-vif-plugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.201 187122 WARNING nova.compute.manager [req-8f48e0cb-93b4-4c07-82ae-481dbdc79790 req-6b92cf5d-bd4c-46f6-a3d8-0b546b413e7c 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received unexpected event network-vif-plugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 for instance with vm_state active and task_state None.
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.311 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.311 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquired lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.312 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 14:36:43 compute-0 nova_compute[187118]: 2025-11-24 14:36:43.312 187122 DEBUG nova.objects.instance [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9c8d60c6-3fd4-44f5-bb06-16da6c642889 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:36:44 compute-0 nova_compute[187118]: 2025-11-24 14:36:44.919 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:45 compute-0 nova_compute[187118]: 2025-11-24 14:36:45.105 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:45 compute-0 nova_compute[187118]: 2025-11-24 14:36:45.497 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updating instance_info_cache with network_info: [{"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:36:45 compute-0 nova_compute[187118]: 2025-11-24 14:36:45.521 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Releasing lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:36:45 compute-0 nova_compute[187118]: 2025-11-24 14:36:45.522 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 14:36:45 compute-0 nova_compute[187118]: 2025-11-24 14:36:45.523 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:45 compute-0 nova_compute[187118]: 2025-11-24 14:36:45.523 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:45 compute-0 nova_compute[187118]: 2025-11-24 14:36:45.523 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:45 compute-0 nova_compute[187118]: 2025-11-24 14:36:45.524 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:36:46 compute-0 ovn_controller[95613]: 2025-11-24T14:36:46Z|00131|binding|INFO|Releasing lport dc553ba5-d5f1-490c-8e4b-b7f3ef2ea42c from this chassis (sb_readonly=0)
Nov 24 14:36:46 compute-0 NetworkManager[55697]: <info>  [1763995006.2832] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Nov 24 14:36:46 compute-0 NetworkManager[55697]: <info>  [1763995006.2842] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.283 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:46 compute-0 ovn_controller[95613]: 2025-11-24T14:36:46Z|00132|binding|INFO|Releasing lport dc553ba5-d5f1-490c-8e4b-b7f3ef2ea42c from this chassis (sb_readonly=0)
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.337 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.346 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.709 187122 DEBUG nova.compute.manager [req-1e5e6b5c-4bb8-4623-9119-e34cb06c801a req-624f60de-efa1-4607-bc6f-3b49fdca23a8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received event network-changed-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.710 187122 DEBUG nova.compute.manager [req-1e5e6b5c-4bb8-4623-9119-e34cb06c801a req-624f60de-efa1-4607-bc6f-3b49fdca23a8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Refreshing instance network info cache due to event network-changed-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.711 187122 DEBUG oslo_concurrency.lockutils [req-1e5e6b5c-4bb8-4623-9119-e34cb06c801a req-624f60de-efa1-4607-bc6f-3b49fdca23a8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.712 187122 DEBUG oslo_concurrency.lockutils [req-1e5e6b5c-4bb8-4623-9119-e34cb06c801a req-624f60de-efa1-4607-bc6f-3b49fdca23a8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.712 187122 DEBUG nova.network.neutron [req-1e5e6b5c-4bb8-4623-9119-e34cb06c801a req-624f60de-efa1-4607-bc6f-3b49fdca23a8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Refreshing network info cache for port 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.761 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:46 compute-0 nova_compute[187118]: 2025-11-24 14:36:46.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:36:48 compute-0 nova_compute[187118]: 2025-11-24 14:36:48.338 187122 DEBUG nova.network.neutron [req-1e5e6b5c-4bb8-4623-9119-e34cb06c801a req-624f60de-efa1-4607-bc6f-3b49fdca23a8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updated VIF entry in instance network info cache for port 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:36:48 compute-0 nova_compute[187118]: 2025-11-24 14:36:48.340 187122 DEBUG nova.network.neutron [req-1e5e6b5c-4bb8-4623-9119-e34cb06c801a req-624f60de-efa1-4607-bc6f-3b49fdca23a8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updating instance_info_cache with network_info: [{"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:36:48 compute-0 nova_compute[187118]: 2025-11-24 14:36:48.355 187122 DEBUG oslo_concurrency.lockutils [req-1e5e6b5c-4bb8-4623-9119-e34cb06c801a req-624f60de-efa1-4607-bc6f-3b49fdca23a8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:36:48 compute-0 podman[217834]: 2025-11-24 14:36:48.473144335 +0000 UTC m=+0.071194151 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 24 14:36:49 compute-0 nova_compute[187118]: 2025-11-24 14:36:49.922 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:50 compute-0 nova_compute[187118]: 2025-11-24 14:36:50.108 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:52 compute-0 podman[217854]: 2025-11-24 14:36:52.469020407 +0000 UTC m=+0.080468978 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:36:52 compute-0 podman[217855]: 2025-11-24 14:36:52.500191319 +0000 UTC m=+0.097768256 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:36:53 compute-0 ovn_controller[95613]: 2025-11-24T14:36:53Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a3:64:93 10.100.0.5
Nov 24 14:36:53 compute-0 ovn_controller[95613]: 2025-11-24T14:36:53Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:64:93 10.100.0.5
Nov 24 14:36:54 compute-0 nova_compute[187118]: 2025-11-24 14:36:54.923 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:55 compute-0 nova_compute[187118]: 2025-11-24 14:36:55.110 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:36:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:56.663 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:36:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:56.664 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:36:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:36:56.665 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:36:57 compute-0 podman[217904]: 2025-11-24 14:36:57.485233478 +0000 UTC m=+0.084983312 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 14:36:57 compute-0 podman[217903]: 2025-11-24 14:36:57.51998051 +0000 UTC m=+0.122827080 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:36:59 compute-0 nova_compute[187118]: 2025-11-24 14:36:59.188 187122 INFO nova.compute.manager [None req-e5d52f10-2f3d-476d-9428-512224f1622a ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Get console output
Nov 24 14:36:59 compute-0 nova_compute[187118]: 2025-11-24 14:36:59.192 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:36:59 compute-0 nova_compute[187118]: 2025-11-24 14:36:59.925 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:00 compute-0 nova_compute[187118]: 2025-11-24 14:37:00.111 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:00 compute-0 ovn_controller[95613]: 2025-11-24T14:37:00Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:64:93 10.100.0.5
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.549 187122 DEBUG nova.compute.manager [req-6e2d66dc-2d5d-4807-ac1a-b348a3fa9a13 req-efe0ae2d-f786-45ed-8825-8baf3c261bb5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received event network-changed-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.550 187122 DEBUG nova.compute.manager [req-6e2d66dc-2d5d-4807-ac1a-b348a3fa9a13 req-efe0ae2d-f786-45ed-8825-8baf3c261bb5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Refreshing instance network info cache due to event network-changed-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.550 187122 DEBUG oslo_concurrency.lockutils [req-6e2d66dc-2d5d-4807-ac1a-b348a3fa9a13 req-efe0ae2d-f786-45ed-8825-8baf3c261bb5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.550 187122 DEBUG oslo_concurrency.lockutils [req-6e2d66dc-2d5d-4807-ac1a-b348a3fa9a13 req-efe0ae2d-f786-45ed-8825-8baf3c261bb5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.551 187122 DEBUG nova.network.neutron [req-6e2d66dc-2d5d-4807-ac1a-b348a3fa9a13 req-efe0ae2d-f786-45ed-8825-8baf3c261bb5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Refreshing network info cache for port 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.653 187122 DEBUG oslo_concurrency.lockutils [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.654 187122 DEBUG oslo_concurrency.lockutils [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.654 187122 DEBUG oslo_concurrency.lockutils [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.655 187122 DEBUG oslo_concurrency.lockutils [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.655 187122 DEBUG oslo_concurrency.lockutils [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.657 187122 INFO nova.compute.manager [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Terminating instance
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.659 187122 DEBUG nova.compute.manager [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:37:02 compute-0 kernel: tap6ac8169b-b7 (unregistering): left promiscuous mode
Nov 24 14:37:02 compute-0 NetworkManager[55697]: <info>  [1763995022.6904] device (tap6ac8169b-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:37:02 compute-0 ovn_controller[95613]: 2025-11-24T14:37:02Z|00133|binding|INFO|Releasing lport 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 from this chassis (sb_readonly=0)
Nov 24 14:37:02 compute-0 ovn_controller[95613]: 2025-11-24T14:37:02Z|00134|binding|INFO|Setting lport 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 down in Southbound
Nov 24 14:37:02 compute-0 ovn_controller[95613]: 2025-11-24T14:37:02Z|00135|binding|INFO|Removing iface tap6ac8169b-b7 ovn-installed in OVS
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.701 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:02 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:02.710 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:64:93 10.100.0.5'], port_security=['fa:16:3e:a3:64:93 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '9c8d60c6-3fd4-44f5-bb06-16da6c642889', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efaca8b4-60bb-4ba9-b254-02fcfdeb3298', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': '073514a9-c9b8-47b9-a740-4f6ccfdf4a5e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bace45b9-4501-4b0e-af9e-56715830300b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=6ac8169b-b76a-43c1-8baa-2e6aa1db7a50) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:37:02 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:02.713 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 in datapath efaca8b4-60bb-4ba9-b254-02fcfdeb3298 unbound from our chassis
Nov 24 14:37:02 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:02.715 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network efaca8b4-60bb-4ba9-b254-02fcfdeb3298, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.716 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:02 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:02.717 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[b0423b94-4f8d-4093-b6cc-99001819ef34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:02 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:02.718 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298 namespace which is not needed anymore
Nov 24 14:37:02 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Nov 24 14:37:02 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 13.296s CPU time.
Nov 24 14:37:02 compute-0 systemd-machined[153483]: Machine qemu-10-instance-0000000a terminated.
Nov 24 14:37:02 compute-0 neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298[217818]: [NOTICE]   (217822) : haproxy version is 2.8.14-c23fe91
Nov 24 14:37:02 compute-0 neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298[217818]: [NOTICE]   (217822) : path to executable is /usr/sbin/haproxy
Nov 24 14:37:02 compute-0 neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298[217818]: [WARNING]  (217822) : Exiting Master process...
Nov 24 14:37:02 compute-0 neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298[217818]: [WARNING]  (217822) : Exiting Master process...
Nov 24 14:37:02 compute-0 neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298[217818]: [ALERT]    (217822) : Current worker (217824) exited with code 143 (Terminated)
Nov 24 14:37:02 compute-0 neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298[217818]: [WARNING]  (217822) : All workers exited. Exiting... (0)
Nov 24 14:37:02 compute-0 systemd[1]: libpod-274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1.scope: Deactivated successfully.
Nov 24 14:37:02 compute-0 podman[217972]: 2025-11-24 14:37:02.890263497 +0000 UTC m=+0.080513009 container died 274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:37:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1-userdata-shm.mount: Deactivated successfully.
Nov 24 14:37:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-bdc3aff5f5c5be7b07bc814d25abd71057cb2b9b755613ac6973655460a6ac5f-merged.mount: Deactivated successfully.
Nov 24 14:37:02 compute-0 podman[217972]: 2025-11-24 14:37:02.935132238 +0000 UTC m=+0.125381740 container cleanup 274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.938 187122 INFO nova.virt.libvirt.driver [-] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Instance destroyed successfully.
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.939 187122 DEBUG nova.objects.instance [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid 9c8d60c6-3fd4-44f5-bb06-16da6c642889 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:37:02 compute-0 systemd[1]: libpod-conmon-274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1.scope: Deactivated successfully.
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.952 187122 DEBUG nova.virt.libvirt.vif [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:36:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2095012351',display_name='tempest-TestNetworkBasicOps-server-2095012351',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2095012351',id=10,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5hfbIEchPMF2ULpdg1raFpV7onF6hG/g7OdGYdHdMpPZtInGWK1Y8d1+9yJeeQFsR/dyF5wucQsgS2RZTgTXRlKyqp5FugKEeWtC4NVoeCSbYWD2ntqsY7ovakHvDWLg==',key_name='tempest-TestNetworkBasicOps-1102787144',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:36:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-y2s2r3gc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:36:41Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=9c8d60c6-3fd4-44f5-bb06-16da6c642889,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.953 187122 DEBUG nova.network.os_vif_util [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.953 187122 DEBUG nova.network.os_vif_util [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a3:64:93,bridge_name='br-int',has_traffic_filtering=True,id=6ac8169b-b76a-43c1-8baa-2e6aa1db7a50,network=Network(efaca8b4-60bb-4ba9-b254-02fcfdeb3298),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac8169b-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.954 187122 DEBUG os_vif [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:64:93,bridge_name='br-int',has_traffic_filtering=True,id=6ac8169b-b76a-43c1-8baa-2e6aa1db7a50,network=Network(efaca8b4-60bb-4ba9-b254-02fcfdeb3298),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac8169b-b7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.956 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.956 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ac8169b-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.958 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.959 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.961 187122 INFO os_vif [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:64:93,bridge_name='br-int',has_traffic_filtering=True,id=6ac8169b-b76a-43c1-8baa-2e6aa1db7a50,network=Network(efaca8b4-60bb-4ba9-b254-02fcfdeb3298),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac8169b-b7')
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.961 187122 INFO nova.virt.libvirt.driver [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Deleting instance files /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889_del
Nov 24 14:37:02 compute-0 nova_compute[187118]: 2025-11-24 14:37:02.962 187122 INFO nova.virt.libvirt.driver [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Deletion of /var/lib/nova/instances/9c8d60c6-3fd4-44f5-bb06-16da6c642889_del complete
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.012 187122 INFO nova.compute.manager [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Took 0.35 seconds to destroy the instance on the hypervisor.
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.013 187122 DEBUG oslo.service.loopingcall [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.013 187122 DEBUG nova.compute.manager [-] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.013 187122 DEBUG nova.network.neutron [-] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:37:03 compute-0 podman[218019]: 2025-11-24 14:37:03.076873249 +0000 UTC m=+0.119412424 container remove 274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:37:03 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:03.085 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[d76bd524-45a6-48bf-8e28-4291e67e4475]: (4, ('Mon Nov 24 02:37:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298 (274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1)\n274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1\nMon Nov 24 02:37:02 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298 (274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1)\n274b12d6979b8c6ec8d6ce9465fd732ad9c5a3e06f601eabd3d4a4d3d611ded1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:03 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:03.087 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[61c724a8-add4-4e88-a841-5637a86284ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:03 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:03.089 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefaca8b4-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.091 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:03 compute-0 kernel: tapefaca8b4-60: left promiscuous mode
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.117 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:03 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:03.119 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe3fb2a-29ea-4437-b952-57be09a63e01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:03 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:03.145 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ef5e7a2d-0c17-4aa4-8111-6e3c3a9fec2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:03 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:03.147 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[5b680527-a82c-4af8-86c7-08bcf0b38b98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:03 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:03.172 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[4936cd3a-afae-4405-aae9-2ca5ed0c6ef8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 326836, 'reachable_time': 26900, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218035, 'error': None, 'target': 'ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:03 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:03.175 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-efaca8b4-60bb-4ba9-b254-02fcfdeb3298 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:37:03 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:03.176 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[57652e1c-5de6-47d4-b541-aafa2dbc063b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:03 compute-0 systemd[1]: run-netns-ovnmeta\x2defaca8b4\x2d60bb\x2d4ba9\x2db254\x2d02fcfdeb3298.mount: Deactivated successfully.
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.787 187122 DEBUG nova.network.neutron [-] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.804 187122 INFO nova.compute.manager [-] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Took 0.79 seconds to deallocate network for instance.
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.844 187122 DEBUG oslo_concurrency.lockutils [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.845 187122 DEBUG oslo_concurrency.lockutils [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.850 187122 DEBUG nova.compute.manager [req-e69efcbc-71b7-45dc-998f-a7bc1e3bb20c req-dacddfc5-b518-487d-919c-8b372e5b63b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received event network-vif-deleted-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.896 187122 DEBUG nova.compute.provider_tree [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.907 187122 DEBUG nova.scheduler.client.report [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.925 187122 DEBUG oslo_concurrency.lockutils [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:03 compute-0 nova_compute[187118]: 2025-11-24 14:37:03.946 187122 INFO nova.scheduler.client.report [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance 9c8d60c6-3fd4-44f5-bb06-16da6c642889
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.005 187122 DEBUG oslo_concurrency.lockutils [None req-04af0704-c9e1-4175-911e-95471e680957 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.332 187122 DEBUG nova.network.neutron [req-6e2d66dc-2d5d-4807-ac1a-b348a3fa9a13 req-efe0ae2d-f786-45ed-8825-8baf3c261bb5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updated VIF entry in instance network info cache for port 6ac8169b-b76a-43c1-8baa-2e6aa1db7a50. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.333 187122 DEBUG nova.network.neutron [req-6e2d66dc-2d5d-4807-ac1a-b348a3fa9a13 req-efe0ae2d-f786-45ed-8825-8baf3c261bb5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Updating instance_info_cache with network_info: [{"id": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "address": "fa:16:3e:a3:64:93", "network": {"id": "efaca8b4-60bb-4ba9-b254-02fcfdeb3298", "bridge": "br-int", "label": "tempest-network-smoke--2131239796", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac8169b-b7", "ovs_interfaceid": "6ac8169b-b76a-43c1-8baa-2e6aa1db7a50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.363 187122 DEBUG oslo_concurrency.lockutils [req-6e2d66dc-2d5d-4807-ac1a-b348a3fa9a13 req-efe0ae2d-f786-45ed-8825-8baf3c261bb5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-9c8d60c6-3fd4-44f5-bb06-16da6c642889" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:37:04 compute-0 podman[218036]: 2025-11-24 14:37:04.444095895 +0000 UTC m=+0.052183614 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.637 187122 DEBUG nova.compute.manager [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received event network-vif-unplugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.637 187122 DEBUG oslo_concurrency.lockutils [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.638 187122 DEBUG oslo_concurrency.lockutils [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.638 187122 DEBUG oslo_concurrency.lockutils [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.638 187122 DEBUG nova.compute.manager [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] No waiting events found dispatching network-vif-unplugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.638 187122 WARNING nova.compute.manager [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received unexpected event network-vif-unplugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 for instance with vm_state deleted and task_state None.
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.639 187122 DEBUG nova.compute.manager [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received event network-vif-plugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.639 187122 DEBUG oslo_concurrency.lockutils [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.639 187122 DEBUG oslo_concurrency.lockutils [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.639 187122 DEBUG oslo_concurrency.lockutils [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "9c8d60c6-3fd4-44f5-bb06-16da6c642889-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.639 187122 DEBUG nova.compute.manager [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] No waiting events found dispatching network-vif-plugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.640 187122 WARNING nova.compute.manager [req-110a12e4-1c5e-460c-9a0a-039f58d8799d req-b8e6b86d-f375-4d0c-9757-d4fdb65ebaea 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Received unexpected event network-vif-plugged-6ac8169b-b76a-43c1-8baa-2e6aa1db7a50 for instance with vm_state deleted and task_state None.
Nov 24 14:37:04 compute-0 nova_compute[187118]: 2025-11-24 14:37:04.928 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:07 compute-0 nova_compute[187118]: 2025-11-24 14:37:07.181 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:07 compute-0 nova_compute[187118]: 2025-11-24 14:37:07.284 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:07 compute-0 nova_compute[187118]: 2025-11-24 14:37:07.959 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:08 compute-0 nova_compute[187118]: 2025-11-24 14:37:08.001 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:08.001 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:37:08 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:08.004 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:37:09 compute-0 nova_compute[187118]: 2025-11-24 14:37:09.930 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:10 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:10.007 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:11 compute-0 podman[218061]: 2025-11-24 14:37:11.481898156 +0000 UTC m=+0.075765957 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:37:12 compute-0 nova_compute[187118]: 2025-11-24 14:37:12.962 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:14 compute-0 nova_compute[187118]: 2025-11-24 14:37:14.933 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:17 compute-0 nova_compute[187118]: 2025-11-24 14:37:17.938 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763995022.9362772, 9c8d60c6-3fd4-44f5-bb06-16da6c642889 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:37:17 compute-0 nova_compute[187118]: 2025-11-24 14:37:17.938 187122 INFO nova.compute.manager [-] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] VM Stopped (Lifecycle Event)
Nov 24 14:37:17 compute-0 nova_compute[187118]: 2025-11-24 14:37:17.959 187122 DEBUG nova.compute.manager [None req-7ebcee2b-f62f-444f-9a0f-6e2eef4d998a - - - - - -] [instance: 9c8d60c6-3fd4-44f5-bb06-16da6c642889] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:37:17 compute-0 nova_compute[187118]: 2025-11-24 14:37:17.964 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:19 compute-0 podman[218085]: 2025-11-24 14:37:19.467245951 +0000 UTC m=+0.071330104 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Nov 24 14:37:19 compute-0 nova_compute[187118]: 2025-11-24 14:37:19.937 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.697 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.697 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.717 187122 DEBUG nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.817 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.818 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.824 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.825 187122 INFO nova.compute.claims [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.931 187122 DEBUG nova.compute.provider_tree [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.944 187122 DEBUG nova.scheduler.client.report [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.966 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.969 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:22 compute-0 nova_compute[187118]: 2025-11-24 14:37:22.970 187122 DEBUG nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.012 187122 DEBUG nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.013 187122 DEBUG nova.network.neutron [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.036 187122 INFO nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.050 187122 DEBUG nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.125 187122 DEBUG nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.127 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.128 187122 INFO nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Creating image(s)
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.129 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.130 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.131 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.155 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.229 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.230 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.231 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.254 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.323 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.324 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.356 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.358 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.359 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.424 187122 DEBUG nova.policy [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.436 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.437 187122 DEBUG nova.virt.disk.api [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.438 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:23 compute-0 podman[218114]: 2025-11-24 14:37:23.457832187 +0000 UTC m=+0.065881085 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 14:37:23 compute-0 podman[218115]: 2025-11-24 14:37:23.468832781 +0000 UTC m=+0.069193046 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.510 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.511 187122 DEBUG nova.virt.disk.api [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.511 187122 DEBUG nova.objects.instance [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid e2b8d81d-63e2-4024-80be-476801e2ac7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.529 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.529 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Ensure instance console log exists: /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.530 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.530 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:23 compute-0 nova_compute[187118]: 2025-11-24 14:37:23.530 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:24 compute-0 nova_compute[187118]: 2025-11-24 14:37:24.456 187122 DEBUG nova.network.neutron [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Successfully created port: efa50b36-70e9-4adb-b0fb-80e7ba4232c1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:37:24 compute-0 nova_compute[187118]: 2025-11-24 14:37:24.939 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:25 compute-0 nova_compute[187118]: 2025-11-24 14:37:25.379 187122 DEBUG nova.network.neutron [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Successfully updated port: efa50b36-70e9-4adb-b0fb-80e7ba4232c1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:37:25 compute-0 nova_compute[187118]: 2025-11-24 14:37:25.391 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:37:25 compute-0 nova_compute[187118]: 2025-11-24 14:37:25.391 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:37:25 compute-0 nova_compute[187118]: 2025-11-24 14:37:25.391 187122 DEBUG nova.network.neutron [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:37:25 compute-0 nova_compute[187118]: 2025-11-24 14:37:25.450 187122 DEBUG nova.compute.manager [req-5ad0039b-ff35-4838-8419-b3d2ddb6aa8a req-897a387d-2883-4500-9f99-932ff43c7031 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:25 compute-0 nova_compute[187118]: 2025-11-24 14:37:25.450 187122 DEBUG nova.compute.manager [req-5ad0039b-ff35-4838-8419-b3d2ddb6aa8a req-897a387d-2883-4500-9f99-932ff43c7031 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing instance network info cache due to event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:37:25 compute-0 nova_compute[187118]: 2025-11-24 14:37:25.451 187122 DEBUG oslo_concurrency.lockutils [req-5ad0039b-ff35-4838-8419-b3d2ddb6aa8a req-897a387d-2883-4500-9f99-932ff43c7031 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:37:25 compute-0 nova_compute[187118]: 2025-11-24 14:37:25.545 187122 DEBUG nova.network.neutron [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.159 187122 DEBUG nova.network.neutron [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updating instance_info_cache with network_info: [{"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.183 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.183 187122 DEBUG nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Instance network_info: |[{"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.184 187122 DEBUG oslo_concurrency.lockutils [req-5ad0039b-ff35-4838-8419-b3d2ddb6aa8a req-897a387d-2883-4500-9f99-932ff43c7031 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.184 187122 DEBUG nova.network.neutron [req-5ad0039b-ff35-4838-8419-b3d2ddb6aa8a req-897a387d-2883-4500-9f99-932ff43c7031 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.190 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Start _get_guest_xml network_info=[{"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.198 187122 WARNING nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.209 187122 DEBUG nova.virt.libvirt.host [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.210 187122 DEBUG nova.virt.libvirt.host [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.215 187122 DEBUG nova.virt.libvirt.host [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.216 187122 DEBUG nova.virt.libvirt.host [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.217 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.217 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.218 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.219 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.219 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.220 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.220 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.221 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.221 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.221 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.222 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.222 187122 DEBUG nova.virt.hardware [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.228 187122 DEBUG nova.virt.libvirt.vif [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:37:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-242850204',display_name='tempest-TestNetworkBasicOps-server-242850204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-242850204',id=11,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAlHrXi81T9a9pzlsIgsG5PWijsu3mk02tp+/Y4nJIjPkb//9Qu9T4jg7ZzsFaSrbIgz/kmPIQgOPrmf1cgC83C2QFTy/JgfN/28UBP5yyShoUahaNQHScKtyj+wzILH5Q==',key_name='tempest-TestNetworkBasicOps-110487052',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-hqtoo21g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:37:23Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=e2b8d81d-63e2-4024-80be-476801e2ac7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.229 187122 DEBUG nova.network.os_vif_util [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.230 187122 DEBUG nova.network.os_vif_util [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:15:57,bridge_name='br-int',has_traffic_filtering=True,id=efa50b36-70e9-4adb-b0fb-80e7ba4232c1,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefa50b36-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.231 187122 DEBUG nova.objects.instance [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid e2b8d81d-63e2-4024-80be-476801e2ac7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.243 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <uuid>e2b8d81d-63e2-4024-80be-476801e2ac7f</uuid>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <name>instance-0000000b</name>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-242850204</nova:name>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:37:26</nova:creationTime>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:37:26 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:37:26 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:37:26 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:37:26 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:37:26 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:37:26 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:37:26 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:37:26 compute-0 nova_compute[187118]:         <nova:port uuid="efa50b36-70e9-4adb-b0fb-80e7ba4232c1">
Nov 24 14:37:26 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <system>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <entry name="serial">e2b8d81d-63e2-4024-80be-476801e2ac7f</entry>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <entry name="uuid">e2b8d81d-63e2-4024-80be-476801e2ac7f</entry>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     </system>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <os>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   </os>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <features>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   </features>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.config"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:a3:15:57"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <target dev="tapefa50b36-70"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/console.log" append="off"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <video>
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     </video>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:37:26 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:37:26 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:37:26 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:37:26 compute-0 nova_compute[187118]: </domain>
Nov 24 14:37:26 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.245 187122 DEBUG nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Preparing to wait for external event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.245 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.246 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.246 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.247 187122 DEBUG nova.virt.libvirt.vif [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:37:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-242850204',display_name='tempest-TestNetworkBasicOps-server-242850204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-242850204',id=11,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAlHrXi81T9a9pzlsIgsG5PWijsu3mk02tp+/Y4nJIjPkb//9Qu9T4jg7ZzsFaSrbIgz/kmPIQgOPrmf1cgC83C2QFTy/JgfN/28UBP5yyShoUahaNQHScKtyj+wzILH5Q==',key_name='tempest-TestNetworkBasicOps-110487052',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-hqtoo21g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:37:23Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=e2b8d81d-63e2-4024-80be-476801e2ac7f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.247 187122 DEBUG nova.network.os_vif_util [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.248 187122 DEBUG nova.network.os_vif_util [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:15:57,bridge_name='br-int',has_traffic_filtering=True,id=efa50b36-70e9-4adb-b0fb-80e7ba4232c1,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefa50b36-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.248 187122 DEBUG os_vif [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:15:57,bridge_name='br-int',has_traffic_filtering=True,id=efa50b36-70e9-4adb-b0fb-80e7ba4232c1,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefa50b36-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.249 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.249 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.250 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.253 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.253 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefa50b36-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.254 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapefa50b36-70, col_values=(('external_ids', {'iface-id': 'efa50b36-70e9-4adb-b0fb-80e7ba4232c1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:15:57', 'vm-uuid': 'e2b8d81d-63e2-4024-80be-476801e2ac7f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.256 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:26 compute-0 NetworkManager[55697]: <info>  [1763995046.2570] manager: (tapefa50b36-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.257 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.263 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.264 187122 INFO os_vif [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:15:57,bridge_name='br-int',has_traffic_filtering=True,id=efa50b36-70e9-4adb-b0fb-80e7ba4232c1,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefa50b36-70')
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.306 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.306 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.306 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:a3:15:57, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.307 187122 INFO nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Using config drive
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.652 187122 INFO nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Creating config drive at /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.config
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.661 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkxea1ugq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.785 187122 DEBUG oslo_concurrency.processutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkxea1ugq" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:26 compute-0 kernel: tapefa50b36-70: entered promiscuous mode
Nov 24 14:37:26 compute-0 NetworkManager[55697]: <info>  [1763995046.8668] manager: (tapefa50b36-70): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.865 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:26 compute-0 ovn_controller[95613]: 2025-11-24T14:37:26Z|00136|binding|INFO|Claiming lport efa50b36-70e9-4adb-b0fb-80e7ba4232c1 for this chassis.
Nov 24 14:37:26 compute-0 ovn_controller[95613]: 2025-11-24T14:37:26Z|00137|binding|INFO|efa50b36-70e9-4adb-b0fb-80e7ba4232c1: Claiming fa:16:3e:a3:15:57 10.100.0.9
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.874 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.878 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:26 compute-0 systemd-machined[153483]: New machine qemu-11-instance-0000000b.
Nov 24 14:37:26 compute-0 ovn_controller[95613]: 2025-11-24T14:37:26Z|00138|binding|INFO|Setting lport efa50b36-70e9-4adb-b0fb-80e7ba4232c1 ovn-installed in OVS
Nov 24 14:37:26 compute-0 nova_compute[187118]: 2025-11-24 14:37:26.950 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:26 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Nov 24 14:37:26 compute-0 systemd-udevd[218177]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:37:26 compute-0 NetworkManager[55697]: <info>  [1763995046.9859] device (tapefa50b36-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:37:26 compute-0 NetworkManager[55697]: <info>  [1763995046.9869] device (tapefa50b36-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.359 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995047.3575099, e2b8d81d-63e2-4024-80be-476801e2ac7f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.360 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] VM Started (Lifecycle Event)
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.379 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.384 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995047.3579187, e2b8d81d-63e2-4024-80be-476801e2ac7f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.385 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] VM Paused (Lifecycle Event)
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.401 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.405 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:37:27 compute-0 ovn_controller[95613]: 2025-11-24T14:37:27Z|00139|binding|INFO|Setting lport efa50b36-70e9-4adb-b0fb-80e7ba4232c1 up in Southbound
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.432 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:15:57 10.100.0.9'], port_security=['fa:16:3e:a3:15:57 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8da34392-5850-485a-8d0d-5dd31b6fc169', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23fd8537-aa59-4c32-8488-c8a540b7ddee, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=efa50b36-70e9-4adb-b0fb-80e7ba4232c1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.433 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.435 104469 INFO neutron.agent.ovn.metadata.agent [-] Port efa50b36-70e9-4adb-b0fb-80e7ba4232c1 in datapath 4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6 bound to our chassis
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.437 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.455 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e685f3ce-5a1f-4a53-a225-d61521e86a12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.456 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4f40bf56-b1 in ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.458 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4f40bf56-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.458 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ad875415-d02d-41d4-adba-b00b8cdf6af5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.460 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[695ffbc6-97d4-4295-8f97-972c471b3fc3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.479 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[c2721833-ed26-4ee4-960d-0f3bedccaef2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.515 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a115cb-fa74-4eb9-bb70-c472e369d755]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.558 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[ee246a92-93f8-4f5f-9edc-24e7d178cacb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 systemd-udevd[218179]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:37:27 compute-0 NetworkManager[55697]: <info>  [1763995047.5658] manager: (tap4f40bf56-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.565 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a25713a8-e493-4de6-88f7-45c76587be6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.599 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[95e44e0b-703b-4ace-a61f-887a4387c22d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.602 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[58e65b12-b020-4a06-bf86-3ab13d97db01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 NetworkManager[55697]: <info>  [1763995047.6277] device (tap4f40bf56-b0): carrier: link connected
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.633 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[bf787e74-19f8-4ca1-888b-fa9d078ecf12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.651 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[2f353803-b417-422d-b403-53be5228ede4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f40bf56-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:1a:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331505, 'reachable_time': 33377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218234, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.668 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[4deb3a82-6df3-4b2b-8622-937fb1a07bd1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:1a05'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 331505, 'tstamp': 331505}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218241, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.684 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3dfb67cb-3789-424b-92d6-25be1337c50b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f40bf56-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:1a:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331505, 'reachable_time': 33377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218252, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 podman[218216]: 2025-11-24 14:37:27.701845344 +0000 UTC m=+0.089773425 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, managed_by=edpm_ansible, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.717 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[f8cce4c2-76c5-4d46-af64-096b69207512]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 podman[218207]: 2025-11-24 14:37:27.718746051 +0000 UTC m=+0.097682604 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.781 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8faa7907-e250-426c-9dc4-6b3715d7f8e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.783 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f40bf56-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.783 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.784 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f40bf56-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.787 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:27 compute-0 kernel: tap4f40bf56-b0: entered promiscuous mode
Nov 24 14:37:27 compute-0 NetworkManager[55697]: <info>  [1763995047.7889] manager: (tap4f40bf56-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.791 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4f40bf56-b0, col_values=(('external_ids', {'iface-id': 'aa5ffafb-d507-447e-b6a0-062a4b8e8014'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.792 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:27 compute-0 ovn_controller[95613]: 2025-11-24T14:37:27Z|00140|binding|INFO|Releasing lport aa5ffafb-d507-447e-b6a0-062a4b8e8014 from this chassis (sb_readonly=0)
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.793 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.795 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.796 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[7acf0c43-1d1d-439c-98d9-dc0c6353fa3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.797 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6.pid.haproxy
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID 4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:37:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:27.798 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'env', 'PROCESS_TAG=haproxy-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:37:27 compute-0 nova_compute[187118]: 2025-11-24 14:37:27.803 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.094 187122 DEBUG nova.network.neutron [req-5ad0039b-ff35-4838-8419-b3d2ddb6aa8a req-897a387d-2883-4500-9f99-932ff43c7031 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updated VIF entry in instance network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.094 187122 DEBUG nova.network.neutron [req-5ad0039b-ff35-4838-8419-b3d2ddb6aa8a req-897a387d-2883-4500-9f99-932ff43c7031 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updating instance_info_cache with network_info: [{"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.106 187122 DEBUG oslo_concurrency.lockutils [req-5ad0039b-ff35-4838-8419-b3d2ddb6aa8a req-897a387d-2883-4500-9f99-932ff43c7031 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:37:28 compute-0 podman[218294]: 2025-11-24 14:37:28.246094331 +0000 UTC m=+0.084187100 container create 9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.257 187122 DEBUG nova.compute.manager [req-84e1cf28-2d0a-4ffe-a3b6-dd4895d626f0 req-6c6239d6-eaab-4b29-9494-7ecb07af1d32 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.258 187122 DEBUG oslo_concurrency.lockutils [req-84e1cf28-2d0a-4ffe-a3b6-dd4895d626f0 req-6c6239d6-eaab-4b29-9494-7ecb07af1d32 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.258 187122 DEBUG oslo_concurrency.lockutils [req-84e1cf28-2d0a-4ffe-a3b6-dd4895d626f0 req-6c6239d6-eaab-4b29-9494-7ecb07af1d32 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.258 187122 DEBUG oslo_concurrency.lockutils [req-84e1cf28-2d0a-4ffe-a3b6-dd4895d626f0 req-6c6239d6-eaab-4b29-9494-7ecb07af1d32 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.258 187122 DEBUG nova.compute.manager [req-84e1cf28-2d0a-4ffe-a3b6-dd4895d626f0 req-6c6239d6-eaab-4b29-9494-7ecb07af1d32 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Processing event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.259 187122 DEBUG nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.263 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995048.262471, e2b8d81d-63e2-4024-80be-476801e2ac7f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.263 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] VM Resumed (Lifecycle Event)
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.265 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.269 187122 INFO nova.virt.libvirt.driver [-] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Instance spawned successfully.
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.270 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.285 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:37:28 compute-0 systemd[1]: Started libpod-conmon-9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11.scope.
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.292 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:37:28 compute-0 podman[218294]: 2025-11-24 14:37:28.203831162 +0000 UTC m=+0.041924001 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.295 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.296 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.296 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.296 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.297 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.297 187122 DEBUG nova.virt.libvirt.driver [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:28 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:37:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/615bc42fd8ce61e8917f6387703b85de5f3dfb3fe698b370d819eeb691e704df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.328 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:37:28 compute-0 podman[218294]: 2025-11-24 14:37:28.337327344 +0000 UTC m=+0.175420133 container init 9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 14:37:28 compute-0 podman[218294]: 2025-11-24 14:37:28.34402765 +0000 UTC m=+0.182120409 container start 9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3)
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.359 187122 INFO nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Took 5.23 seconds to spawn the instance on the hypervisor.
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.359 187122 DEBUG nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:37:28 compute-0 neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6[218310]: [NOTICE]   (218314) : New worker (218316) forked
Nov 24 14:37:28 compute-0 neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6[218310]: [NOTICE]   (218314) : Loading success.
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.432 187122 INFO nova.compute.manager [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Took 5.65 seconds to build instance.
Nov 24 14:37:28 compute-0 nova_compute[187118]: 2025-11-24 14:37:28.446 187122 DEBUG oslo_concurrency.lockutils [None req-22891100-d1ba-49b9-b40d-bdddbc37b525 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.749s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:29 compute-0 nova_compute[187118]: 2025-11-24 14:37:29.941 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:30 compute-0 nova_compute[187118]: 2025-11-24 14:37:30.331 187122 DEBUG nova.compute.manager [req-093a348d-4e64-4e26-a17e-81e2757ec70f req-3583f2f2-98e4-4df2-86a0-8b7ab198936a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:30 compute-0 nova_compute[187118]: 2025-11-24 14:37:30.332 187122 DEBUG oslo_concurrency.lockutils [req-093a348d-4e64-4e26-a17e-81e2757ec70f req-3583f2f2-98e4-4df2-86a0-8b7ab198936a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:30 compute-0 nova_compute[187118]: 2025-11-24 14:37:30.332 187122 DEBUG oslo_concurrency.lockutils [req-093a348d-4e64-4e26-a17e-81e2757ec70f req-3583f2f2-98e4-4df2-86a0-8b7ab198936a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:30 compute-0 nova_compute[187118]: 2025-11-24 14:37:30.332 187122 DEBUG oslo_concurrency.lockutils [req-093a348d-4e64-4e26-a17e-81e2757ec70f req-3583f2f2-98e4-4df2-86a0-8b7ab198936a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:30 compute-0 nova_compute[187118]: 2025-11-24 14:37:30.333 187122 DEBUG nova.compute.manager [req-093a348d-4e64-4e26-a17e-81e2757ec70f req-3583f2f2-98e4-4df2-86a0-8b7ab198936a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] No waiting events found dispatching network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:37:30 compute-0 nova_compute[187118]: 2025-11-24 14:37:30.333 187122 WARNING nova.compute.manager [req-093a348d-4e64-4e26-a17e-81e2757ec70f req-3583f2f2-98e4-4df2-86a0-8b7ab198936a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received unexpected event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 for instance with vm_state active and task_state None.
Nov 24 14:37:31 compute-0 nova_compute[187118]: 2025-11-24 14:37:31.258 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:34 compute-0 nova_compute[187118]: 2025-11-24 14:37:34.944 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.134 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'name': 'tempest-TestNetworkBasicOps-server-242850204', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0b17c7cc946a4f86aea7e5b323e88562', 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'hostId': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.135 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.156 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.156 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance e2b8d81d-63e2-4024-80be-476801e2ac7f: ceilometer.compute.pollsters.NoVolumeException
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.156 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.192 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.193 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f36424d1-98c4-458e-97c6-747972cbcf95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-vda', 'timestamp': '2025-11-24T14:37:35.157156', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e02138c-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': '75a03b43dc049fcc48048a88fa44b6af0295544e083327c3e2b14c09da8edfa1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 
'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-sda', 'timestamp': '2025-11-24T14:37:35.157156', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e022822-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': '58f0f199b2ef633722c9c05a2e4d1a68220bb580182167f633e7dbd4e1a7b70e'}]}, 'timestamp': '2025-11-24 14:37:35.193758', '_unique_id': 'fa7908aeea3f4dffa322800259436e8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.195 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.199 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e2b8d81d-63e2-4024-80be-476801e2ac7f / tapefa50b36-70 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.200 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3428a5cd-39c1-4156-8791-70e61e13cb64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.196971', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e0331a4-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': '11fbd63eb0d9b50ba233d7af7ae8205d47b1fec02dd99cc1b13f14841032e345'}]}, 'timestamp': '2025-11-24 14:37:35.200547', '_unique_id': 'e9eb4e1ae2c14f32aed64cbddee6dd32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.201 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.203 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f6494db-c1b1-4032-b430-da5ea5859a61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.203062', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e03a800-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': 'b958865f44714c9e6d7b20770865cf9bd10dd0b60d4a45ff6ffa448c996a0e97'}]}, 'timestamp': '2025-11-24 14:37:35.203568', '_unique_id': '6ad8447d099a47aab1b800361190ff30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.204 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.206 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da01dc93-493a-418f-8b1d-f13091ed6665', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.205970', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e041966-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': 'a1ec02dfdc35563c0ee15e7f80dc3ee7d0249f249fae9e50407795ac5743cb9e'}]}, 'timestamp': '2025-11-24 14:37:35.206472', '_unique_id': '52304f4e3d2745668a9ef2a1934d9a87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.207 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.209 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.read.latency volume: 364896145 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.209 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.read.latency volume: 3979650 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd7fa6586-9d71-4459-8f35-133f8ad0f5f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 364896145, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-vda', 'timestamp': '2025-11-24T14:37:35.209070', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e049274-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': '42a1b2235551b9b9931e63b819fcb43a687ca42c35a126d72e20f22b3151898f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3979650, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-sda', 'timestamp': '2025-11-24T14:37:35.209070', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e04a606-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': 'b1b1e9f8c7940d5b68dfe4152de2b6f4f8e6d97279af27e66746ad3954e92c81'}]}, 'timestamp': '2025-11-24 14:37:35.210102', '_unique_id': '59ae3feb462c416195a6e7d2359a217c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.211 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.212 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.213 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d1dcc38-81bd-4b36-a929-951257ca3abe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-vda', 'timestamp': '2025-11-24T14:37:35.212850', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e05264e-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': '9a9221a324e22aaddfa94e701efed05b4827d15bb82eca06d024ad19c4b019be'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-sda', 'timestamp': '2025-11-24T14:37:35.212850', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e053774-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': 'd7cb97226ebf171c4188c74b0888e59a66d215b03bcbd2c458e860793bed88ce'}]}, 'timestamp': '2025-11-24 14:37:35.213803', '_unique_id': 'a23f7c3a753c446582700f99f7ae8c9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.214 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.216 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.216 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.216 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-242850204>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-242850204>]
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.216 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a391cb4a-ca1b-4dd2-a19b-5af4c26e6017', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.216944', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e05c5e0-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': 'd8ad874e843f652c903338ab779001da3856e509411f18e059a1f173cc6d410a'}]}, 'timestamp': '2025-11-24 14:37:35.217441', '_unique_id': '5dad10f102ce498781917659ea8f8e11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.218 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.219 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.220 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-242850204>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-242850204>]
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.235 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.236 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21366cd5-61ed-4b51-b587-b1687280db35', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-vda', 'timestamp': '2025-11-24T14:37:35.220421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e089bd0-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.709864613, 'message_signature': 'df15a583015fcf0c31714649549351b1c2ebe2734b9bfb39eaf53e3294bbf90f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-sda', 'timestamp': '2025-11-24T14:37:35.220421', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e08adc8-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.709864613, 'message_signature': 'd71ce3512a4eee32ebd4b38f09e8b72f6dc85c502b2946367769a197a340257e'}]}, 'timestamp': '2025-11-24 14:37:35.236452', '_unique_id': 'be6ed8be84e74ea8b5589b9b87cee462'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.237 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.239 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea9f3078-034f-4e47-ba93-65fb29394cef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.239381', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e093464-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': 'ed38540b3b17f0cb77ed35a04c27b30fa388e3b06b1fcd8abe4944c969694a23'}]}, 'timestamp': '2025-11-24 14:37:35.239946', '_unique_id': 'de94acb82b364a6791425c93d3e513f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.241 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.242 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.242 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93407887-eadb-44dc-b8bd-1a4fbecbc08a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-vda', 'timestamp': '2025-11-24T14:37:35.242410', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e09abe2-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.709864613, 'message_signature': 'cbd7dbbc5d45a29c58e414b102bf4cba72bb949e374371f736121561911eafe8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-sda', 'timestamp': '2025-11-24T14:37:35.242410', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e09bd80-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.709864613, 'message_signature': 'cb522356b46ab6133688994eee8e4b03d584ab61631c0c70af691b3b9b3e485b'}]}, 'timestamp': '2025-11-24 14:37:35.243405', '_unique_id': '943cf397c5fb49f793d05fe75a351992'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.244 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.245 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.246 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73df258f-009e-47cf-b0a9-6a2348dd663d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-vda', 'timestamp': '2025-11-24T14:37:35.245765', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e0a2b94-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': 'e93e22f5aec552f038ff16aa03af351494a8eb1a561c399b4f220d3f5d8bc39a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-sda', 'timestamp': '2025-11-24T14:37:35.245765', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e0a3c4c-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': '7615e5dafa42c780f9957f89e0cbe6573d13a46ede278f0eecbf7dbbe547cdc9'}]}, 'timestamp': '2025-11-24 14:37:35.246710', '_unique_id': '4e62611922704b918b5405dc3ce54b11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.247 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.249 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.249 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b893df6d-1c33-4583-9542-eb46de901585', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-vda', 'timestamp': '2025-11-24T14:37:35.249001', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e0aa9a2-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.709864613, 'message_signature': '46cc68822de09da3f67554cada7252d11c0affe766ab86813a4f31b814ccbc33'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 
'e2b8d81d-63e2-4024-80be-476801e2ac7f-sda', 'timestamp': '2025-11-24T14:37:35.249001', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e0abb86-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.709864613, 'message_signature': '74eb94f1ae5223733b449285ff97bbae626e282c964e0fc19eea591b248511e9'}]}, 'timestamp': '2025-11-24 14:37:35.249911', '_unique_id': '8198b4712b7944f58cf7855802d9b6ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.250 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.252 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.252 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73f2de10-3784-49be-848c-4f9a795dbca7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-vda', 'timestamp': '2025-11-24T14:37:35.252400', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e0b302a-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': '4822087051c870ee7128ccf7a59e5d402a1dda99f1b513cc5523c5927f0e2090'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 
'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-sda', 'timestamp': '2025-11-24T14:37:35.252400', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e0b415a-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': '75c66c122ec25dca8f7f17f8f537e180978f645f2b3e5907e9448a45a33ff2a3'}]}, 'timestamp': '2025-11-24 14:37:35.253335', '_unique_id': 'b30ca1158f274245b6351ac3f0a9953f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.254 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.255 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.255 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-242850204>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-242850204>]
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.256 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a3f8bcc-f197-4fe3-8cdd-7ff18662242f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.256324', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e0bc7c4-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': '95e2561450250b5ccd317270b0dd837b4c51ef957c323d87e085d52e7a0a8313'}]}, 'timestamp': '2025-11-24 14:37:35.256833', '_unique_id': 'c4504015db3f4dfa9ba663a235e9f984'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.257 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.259 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.259 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.259 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '758368f6-a2fa-422c-a37c-f0f83da72d05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-vda', 'timestamp': '2025-11-24T14:37:35.259223', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1e0c393e-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': '7d9ee9c8a6038e3ec96621a65b832074df2d4e2966417e02e153cde1d470bd27'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f-sda', 'timestamp': '2025-11-24T14:37:35.259223', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1e0c50d6-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64661034, 'message_signature': 'e97ee9e341880be1a57ecdf731156096539b05c8d03a96771cfd0391e26243af'}]}, 'timestamp': '2025-11-24 14:37:35.260288', '_unique_id': '861cc4bf0932417296afa9b476d9d992'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.261 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.262 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.262 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/cpu volume: 6650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f68e483-83ee-4d9a-a50d-82767dfb40ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6650000000, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'timestamp': '2025-11-24T14:37:35.262850', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'instance-0000000b', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1e0cc700-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.64552668, 'message_signature': 'fbe428ce8a42883eaf8e37babf48a88d6640c4f7700d1f4936a84e33533af323'}]}, 'timestamp': '2025-11-24 14:37:35.263326', '_unique_id': '78f9c213e7bc4990b99c617086cd0d8b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.264 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.265 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.265 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ee44b13-a0d4-4fbc-b937-6e8812761180', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.265572', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e0d3276-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': '46905f5183fb400dcaf2efc3bdb587963ebd3d3bd5b755b0423d2336d2485711'}]}, 'timestamp': '2025-11-24 14:37:35.266092', '_unique_id': 'cccc3dd235be48989c5527b9cafb1bf6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.267 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.268 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.268 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.268 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-242850204>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-242850204>]
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.269 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.269 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea80f3be-2948-4785-8ac5-b1be758aed4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.269298', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e0dc2d6-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': 'a047dbef936b87608fb28027a3eca0155b7f415f374bb9855391ab65328a81eb'}]}, 'timestamp': '2025-11-24 14:37:35.269821', '_unique_id': '14ee6daa135843ac8b28a6d787b70970'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.270 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.271 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba0c3158-811d-4814-952d-892dc5cd5744', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.271918', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e0e25c8-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': '59b02fff730242fcbee4155ddfd0eadbee2eec4191032687aaa332489d4b0b92'}]}, 'timestamp': '2025-11-24 14:37:35.272239', '_unique_id': '4674d58e30fc45578ed1b6cec1c3979a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.273 12 DEBUG ceilometer.compute.pollsters [-] e2b8d81d-63e2-4024-80be-476801e2ac7f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89b8b6ba-5641-4897-8eb2-e935cdb4217b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_name': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_name': None, 'resource_id': 'instance-0000000b-e2b8d81d-63e2-4024-80be-476801e2ac7f-tapefa50b36-70', 'timestamp': '2025-11-24T14:37:35.273925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-242850204', 'name': 'tapefa50b36-70', 'instance_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'instance_type': 'm1.nano', 'host': '14f67c347d8666d48fa5793ae02ca92ca4cabc02813b4e424988ed4a', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': '6e922a91-f8b6-466b-9721-3ed72f453145', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}, 'image_ref': '54a328f6-92ea-410e-beaf-ba04bab9ef9a', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:15:57', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapefa50b36-70'}, 'message_id': '1e0e7582-c943-11f0-9454-fa163e7ea22e', 'monotonic_time': 3322.68642213, 'message_signature': 'b2f83f51b765efa8d1f33c541d30113afb70257f6b30e3ef6822cbcad79ec57c'}]}, 'timestamp': '2025-11-24 14:37:35.274299', '_unique_id': '31a19707b93f48f492a9e76e073a6691'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     yield
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 24 14:37:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:37:35.275 12 ERROR oslo_messaging.notify.messaging 
Nov 24 14:37:35 compute-0 podman[218325]: 2025-11-24 14:37:35.457475705 +0000 UTC m=+0.062429691 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:37:35 compute-0 NetworkManager[55697]: <info>  [1763995055.5192] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Nov 24 14:37:35 compute-0 ovn_controller[95613]: 2025-11-24T14:37:35Z|00141|binding|INFO|Releasing lport aa5ffafb-d507-447e-b6a0-062a4b8e8014 from this chassis (sb_readonly=0)
Nov 24 14:37:35 compute-0 NetworkManager[55697]: <info>  [1763995055.5201] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Nov 24 14:37:35 compute-0 nova_compute[187118]: 2025-11-24 14:37:35.519 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:35 compute-0 ovn_controller[95613]: 2025-11-24T14:37:35Z|00142|binding|INFO|Releasing lport aa5ffafb-d507-447e-b6a0-062a4b8e8014 from this chassis (sb_readonly=0)
Nov 24 14:37:35 compute-0 nova_compute[187118]: 2025-11-24 14:37:35.571 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:35 compute-0 nova_compute[187118]: 2025-11-24 14:37:35.581 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:36 compute-0 nova_compute[187118]: 2025-11-24 14:37:36.240 187122 DEBUG nova.compute.manager [req-a83a6a78-abab-41a9-bfee-7bdaf2b7549c req-e4efdd09-a30c-4a08-ad9d-0efc631178e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:36 compute-0 nova_compute[187118]: 2025-11-24 14:37:36.241 187122 DEBUG nova.compute.manager [req-a83a6a78-abab-41a9-bfee-7bdaf2b7549c req-e4efdd09-a30c-4a08-ad9d-0efc631178e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing instance network info cache due to event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:37:36 compute-0 nova_compute[187118]: 2025-11-24 14:37:36.241 187122 DEBUG oslo_concurrency.lockutils [req-a83a6a78-abab-41a9-bfee-7bdaf2b7549c req-e4efdd09-a30c-4a08-ad9d-0efc631178e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:37:36 compute-0 nova_compute[187118]: 2025-11-24 14:37:36.242 187122 DEBUG oslo_concurrency.lockutils [req-a83a6a78-abab-41a9-bfee-7bdaf2b7549c req-e4efdd09-a30c-4a08-ad9d-0efc631178e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:37:36 compute-0 nova_compute[187118]: 2025-11-24 14:37:36.242 187122 DEBUG nova.network.neutron [req-a83a6a78-abab-41a9-bfee-7bdaf2b7549c req-e4efdd09-a30c-4a08-ad9d-0efc631178e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:37:36 compute-0 nova_compute[187118]: 2025-11-24 14:37:36.261 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:37 compute-0 nova_compute[187118]: 2025-11-24 14:37:37.238 187122 DEBUG nova.network.neutron [req-a83a6a78-abab-41a9-bfee-7bdaf2b7549c req-e4efdd09-a30c-4a08-ad9d-0efc631178e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updated VIF entry in instance network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:37:37 compute-0 nova_compute[187118]: 2025-11-24 14:37:37.239 187122 DEBUG nova.network.neutron [req-a83a6a78-abab-41a9-bfee-7bdaf2b7549c req-e4efdd09-a30c-4a08-ad9d-0efc631178e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updating instance_info_cache with network_info: [{"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:37:37 compute-0 nova_compute[187118]: 2025-11-24 14:37:37.258 187122 DEBUG oslo_concurrency.lockutils [req-a83a6a78-abab-41a9-bfee-7bdaf2b7549c req-e4efdd09-a30c-4a08-ad9d-0efc631178e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:37:37 compute-0 nova_compute[187118]: 2025-11-24 14:37:37.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:37:38 compute-0 nova_compute[187118]: 2025-11-24 14:37:38.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:37:39 compute-0 nova_compute[187118]: 2025-11-24 14:37:39.946 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:40 compute-0 nova_compute[187118]: 2025-11-24 14:37:40.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:37:40 compute-0 nova_compute[187118]: 2025-11-24 14:37:40.821 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:40 compute-0 nova_compute[187118]: 2025-11-24 14:37:40.821 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:40 compute-0 nova_compute[187118]: 2025-11-24 14:37:40.822 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:40 compute-0 nova_compute[187118]: 2025-11-24 14:37:40.822 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:37:40 compute-0 nova_compute[187118]: 2025-11-24 14:37:40.898 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:40 compute-0 nova_compute[187118]: 2025-11-24 14:37:40.956 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:40 compute-0 nova_compute[187118]: 2025-11-24 14:37:40.957 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.048 187122 DEBUG oslo_concurrency.processutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.204 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "137f208f-228c-4e2f-9395-79c5d643c17a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.205 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.221 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.222 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5557MB free_disk=73.43004989624023GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.222 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.222 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.226 187122 DEBUG nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.264 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:41 compute-0 ovn_controller[95613]: 2025-11-24T14:37:41Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a3:15:57 10.100.0.9
Nov 24 14:37:41 compute-0 ovn_controller[95613]: 2025-11-24T14:37:41Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:15:57 10.100.0.9
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.343 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.354 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance e2b8d81d-63e2-4024-80be-476801e2ac7f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.378 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Instance 137f208f-228c-4e2f-9395-79c5d643c17a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.379 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.379 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.450 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.466 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.490 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.490 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.491 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.499 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.500 187122 INFO nova.compute.claims [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.632 187122 DEBUG nova.compute.provider_tree [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.643 187122 DEBUG nova.scheduler.client.report [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.672 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.673 187122 DEBUG nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.730 187122 DEBUG nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.730 187122 DEBUG nova.network.neutron [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.751 187122 INFO nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.771 187122 DEBUG nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.858 187122 DEBUG nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.859 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.860 187122 INFO nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Creating image(s)
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.860 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.861 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.861 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.878 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.968 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.969 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.970 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:41 compute-0 nova_compute[187118]: 2025-11-24 14:37:41.986 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.051 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.052 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.219 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk 1073741824" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.221 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.250s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.221 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.275 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.276 187122 DEBUG nova.virt.disk.api [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.276 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.369 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.370 187122 DEBUG nova.virt.disk.api [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.371 187122 DEBUG nova.objects.instance [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid 137f208f-228c-4e2f-9395-79c5d643c17a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.407 187122 DEBUG nova.policy [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.413 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.414 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Ensure instance console log exists: /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.415 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.416 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:42 compute-0 nova_compute[187118]: 2025-11-24 14:37:42.417 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:42 compute-0 podman[218386]: 2025-11-24 14:37:42.455784431 +0000 UTC m=+0.058889354 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:37:43 compute-0 nova_compute[187118]: 2025-11-24 14:37:43.441 187122 DEBUG nova.network.neutron [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Successfully created port: 4bc0703d-e5d0-4012-b8af-218ac012eb92 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:37:43 compute-0 nova_compute[187118]: 2025-11-24 14:37:43.490 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:37:43 compute-0 nova_compute[187118]: 2025-11-24 14:37:43.491 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:37:43 compute-0 nova_compute[187118]: 2025-11-24 14:37:43.491 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:37:43 compute-0 nova_compute[187118]: 2025-11-24 14:37:43.506 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Nov 24 14:37:43 compute-0 nova_compute[187118]: 2025-11-24 14:37:43.641 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:37:43 compute-0 nova_compute[187118]: 2025-11-24 14:37:43.641 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquired lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:37:43 compute-0 nova_compute[187118]: 2025-11-24 14:37:43.642 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Nov 24 14:37:43 compute-0 nova_compute[187118]: 2025-11-24 14:37:43.642 187122 DEBUG nova.objects.instance [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e2b8d81d-63e2-4024-80be-476801e2ac7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:37:44 compute-0 nova_compute[187118]: 2025-11-24 14:37:44.536 187122 DEBUG nova.network.neutron [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Successfully updated port: 4bc0703d-e5d0-4012-b8af-218ac012eb92 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:37:44 compute-0 nova_compute[187118]: 2025-11-24 14:37:44.552 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:37:44 compute-0 nova_compute[187118]: 2025-11-24 14:37:44.553 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:37:44 compute-0 nova_compute[187118]: 2025-11-24 14:37:44.553 187122 DEBUG nova.network.neutron [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:37:44 compute-0 nova_compute[187118]: 2025-11-24 14:37:44.618 187122 DEBUG nova.compute.manager [req-cfa09c17-04a8-48b1-a665-1f85c966a640 req-cf448835-7575-43d0-b70e-442158e73808 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received event network-changed-4bc0703d-e5d0-4012-b8af-218ac012eb92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:44 compute-0 nova_compute[187118]: 2025-11-24 14:37:44.618 187122 DEBUG nova.compute.manager [req-cfa09c17-04a8-48b1-a665-1f85c966a640 req-cf448835-7575-43d0-b70e-442158e73808 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Refreshing instance network info cache due to event network-changed-4bc0703d-e5d0-4012-b8af-218ac012eb92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:37:44 compute-0 nova_compute[187118]: 2025-11-24 14:37:44.619 187122 DEBUG oslo_concurrency.lockutils [req-cfa09c17-04a8-48b1-a665-1f85c966a640 req-cf448835-7575-43d0-b70e-442158e73808 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:37:44 compute-0 nova_compute[187118]: 2025-11-24 14:37:44.704 187122 DEBUG nova.network.neutron [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:37:44 compute-0 nova_compute[187118]: 2025-11-24 14:37:44.947 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.398 187122 DEBUG nova.network.neutron [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updating instance_info_cache with network_info: [{"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.412 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Releasing lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.413 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.413 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.414 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.414 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.415 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.616 187122 DEBUG nova.network.neutron [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Updating instance_info_cache with network_info: [{"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.631 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.632 187122 DEBUG nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Instance network_info: |[{"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.632 187122 DEBUG oslo_concurrency.lockutils [req-cfa09c17-04a8-48b1-a665-1f85c966a640 req-cf448835-7575-43d0-b70e-442158e73808 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.632 187122 DEBUG nova.network.neutron [req-cfa09c17-04a8-48b1-a665-1f85c966a640 req-cf448835-7575-43d0-b70e-442158e73808 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Refreshing network info cache for port 4bc0703d-e5d0-4012-b8af-218ac012eb92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.635 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Start _get_guest_xml network_info=[{"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.640 187122 WARNING nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.644 187122 DEBUG nova.virt.libvirt.host [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.645 187122 DEBUG nova.virt.libvirt.host [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.652 187122 DEBUG nova.virt.libvirt.host [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.653 187122 DEBUG nova.virt.libvirt.host [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.653 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.654 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.654 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.654 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.655 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.655 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.655 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.655 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.656 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.656 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.656 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.656 187122 DEBUG nova.virt.hardware [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.660 187122 DEBUG nova.virt.libvirt.vif [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:37:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-975957140',display_name='tempest-TestNetworkBasicOps-server-975957140',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-975957140',id=12,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCxNy0dBa04REfvaesgczGDro8bdjrxvI6RHFS+sMtDcQUVQcSPAN48ZwR7KM8mvxUtNFIHbPfR/lPzg6WR96yLxUnQdHJnNOkgv1D0zLFDDPkkqNWEl20LKoSGubz5pXQ==',key_name='tempest-TestNetworkBasicOps-1145703243',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-9chpac4o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:37:41Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=137f208f-228c-4e2f-9395-79c5d643c17a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.660 187122 DEBUG nova.network.os_vif_util [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.661 187122 DEBUG nova.network.os_vif_util [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:68:e5,bridge_name='br-int',has_traffic_filtering=True,id=4bc0703d-e5d0-4012-b8af-218ac012eb92,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bc0703d-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.661 187122 DEBUG nova.objects.instance [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid 137f208f-228c-4e2f-9395-79c5d643c17a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.674 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <uuid>137f208f-228c-4e2f-9395-79c5d643c17a</uuid>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <name>instance-0000000c</name>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-975957140</nova:name>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:37:45</nova:creationTime>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:37:45 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:37:45 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:37:45 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:37:45 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:37:45 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:37:45 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:37:45 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:37:45 compute-0 nova_compute[187118]:         <nova:port uuid="4bc0703d-e5d0-4012-b8af-218ac012eb92">
Nov 24 14:37:45 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <system>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <entry name="serial">137f208f-228c-4e2f-9395-79c5d643c17a</entry>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <entry name="uuid">137f208f-228c-4e2f-9395-79c5d643c17a</entry>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     </system>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <os>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   </os>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <features>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   </features>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk.config"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:5b:68:e5"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <target dev="tap4bc0703d-e5"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/console.log" append="off"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <video>
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     </video>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:37:45 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:37:45 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:37:45 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:37:45 compute-0 nova_compute[187118]: </domain>
Nov 24 14:37:45 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.675 187122 DEBUG nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Preparing to wait for external event network-vif-plugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.676 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.676 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.676 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.677 187122 DEBUG nova.virt.libvirt.vif [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:37:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-975957140',display_name='tempest-TestNetworkBasicOps-server-975957140',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-975957140',id=12,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCxNy0dBa04REfvaesgczGDro8bdjrxvI6RHFS+sMtDcQUVQcSPAN48ZwR7KM8mvxUtNFIHbPfR/lPzg6WR96yLxUnQdHJnNOkgv1D0zLFDDPkkqNWEl20LKoSGubz5pXQ==',key_name='tempest-TestNetworkBasicOps-1145703243',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-9chpac4o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:37:41Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=137f208f-228c-4e2f-9395-79c5d643c17a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.677 187122 DEBUG nova.network.os_vif_util [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.678 187122 DEBUG nova.network.os_vif_util [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:68:e5,bridge_name='br-int',has_traffic_filtering=True,id=4bc0703d-e5d0-4012-b8af-218ac012eb92,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bc0703d-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.678 187122 DEBUG os_vif [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:68:e5,bridge_name='br-int',has_traffic_filtering=True,id=4bc0703d-e5d0-4012-b8af-218ac012eb92,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bc0703d-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.679 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.679 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.680 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.682 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.682 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4bc0703d-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.683 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4bc0703d-e5, col_values=(('external_ids', {'iface-id': '4bc0703d-e5d0-4012-b8af-218ac012eb92', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:68:e5', 'vm-uuid': '137f208f-228c-4e2f-9395-79c5d643c17a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.684 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:45 compute-0 NetworkManager[55697]: <info>  [1763995065.6856] manager: (tap4bc0703d-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.687 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.691 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.691 187122 INFO os_vif [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:68:e5,bridge_name='br-int',has_traffic_filtering=True,id=4bc0703d-e5d0-4012-b8af-218ac012eb92,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bc0703d-e5')
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.768 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.768 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.769 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:5b:68:e5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:37:45 compute-0 nova_compute[187118]: 2025-11-24 14:37:45.769 187122 INFO nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Using config drive
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.347 187122 INFO nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Creating config drive at /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk.config
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.353 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfhyb6439 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.476 187122 DEBUG oslo_concurrency.processutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfhyb6439" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:37:46 compute-0 kernel: tap4bc0703d-e5: entered promiscuous mode
Nov 24 14:37:46 compute-0 NetworkManager[55697]: <info>  [1763995066.5256] manager: (tap4bc0703d-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Nov 24 14:37:46 compute-0 ovn_controller[95613]: 2025-11-24T14:37:46Z|00143|binding|INFO|Claiming lport 4bc0703d-e5d0-4012-b8af-218ac012eb92 for this chassis.
Nov 24 14:37:46 compute-0 ovn_controller[95613]: 2025-11-24T14:37:46Z|00144|binding|INFO|4bc0703d-e5d0-4012-b8af-218ac012eb92: Claiming fa:16:3e:5b:68:e5 10.100.0.11
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.529 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.545 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:46 compute-0 ovn_controller[95613]: 2025-11-24T14:37:46Z|00145|binding|INFO|Setting lport 4bc0703d-e5d0-4012-b8af-218ac012eb92 ovn-installed in OVS
Nov 24 14:37:46 compute-0 ovn_controller[95613]: 2025-11-24T14:37:46Z|00146|binding|INFO|Setting lport 4bc0703d-e5d0-4012-b8af-218ac012eb92 up in Southbound
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.546 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:68:e5 10.100.0.11'], port_security=['fa:16:3e:5b:68:e5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '137f208f-228c-4e2f-9395-79c5d643c17a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a71ff4a1-4692-40dc-a195-bd7cee824485', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23fd8537-aa59-4c32-8488-c8a540b7ddee, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=4bc0703d-e5d0-4012-b8af-218ac012eb92) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.547 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.550 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 4bc0703d-e5d0-4012-b8af-218ac012eb92 in datapath 4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6 bound to our chassis
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.552 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6
Nov 24 14:37:46 compute-0 systemd-udevd[218427]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:37:46 compute-0 systemd-machined[153483]: New machine qemu-12-instance-0000000c.
Nov 24 14:37:46 compute-0 NetworkManager[55697]: <info>  [1763995066.5717] device (tap4bc0703d-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:37:46 compute-0 NetworkManager[55697]: <info>  [1763995066.5723] device (tap4bc0703d-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.570 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[884e61aa-88e4-4b24-9b66-737d3713ac04]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:46 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.599 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[182afe02-1d3b-4ca0-ac5f-bd428bec00b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.603 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[08bebd1d-1d38-44fc-bedd-4b83a19e5ed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.631 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[e9a78c73-baf1-4559-b9e1-20186b6f3fde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.645 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[43f8c3c2-5a22-4d9f-a28e-19f4b340482f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f40bf56-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:1a:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331505, 'reachable_time': 33377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218442, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.662 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3a5a7d10-d20c-4e78-9d58-fcb847de4ba8]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4f40bf56-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 331517, 'tstamp': 331517}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218443, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4f40bf56-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 331520, 'tstamp': 331520}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218443, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.663 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f40bf56-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.665 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.666 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f40bf56-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.666 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.667 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4f40bf56-b0, col_values=(('external_ids', {'iface-id': 'aa5ffafb-d507-447e-b6a0-062a4b8e8014'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:37:46 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:46.667 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.757 187122 DEBUG nova.compute.manager [req-9ea399a0-4e4b-4283-b755-d101fedc747f req-c740a286-ab19-4af0-be01-71adb1b8919a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received event network-vif-plugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.757 187122 DEBUG oslo_concurrency.lockutils [req-9ea399a0-4e4b-4283-b755-d101fedc747f req-c740a286-ab19-4af0-be01-71adb1b8919a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.758 187122 DEBUG oslo_concurrency.lockutils [req-9ea399a0-4e4b-4283-b755-d101fedc747f req-c740a286-ab19-4af0-be01-71adb1b8919a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.758 187122 DEBUG oslo_concurrency.lockutils [req-9ea399a0-4e4b-4283-b755-d101fedc747f req-c740a286-ab19-4af0-be01-71adb1b8919a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.758 187122 DEBUG nova.compute.manager [req-9ea399a0-4e4b-4283-b755-d101fedc747f req-c740a286-ab19-4af0-be01-71adb1b8919a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Processing event network-vif-plugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.898 187122 DEBUG nova.network.neutron [req-cfa09c17-04a8-48b1-a665-1f85c966a640 req-cf448835-7575-43d0-b70e-442158e73808 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Updated VIF entry in instance network info cache for port 4bc0703d-e5d0-4012-b8af-218ac012eb92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.899 187122 DEBUG nova.network.neutron [req-cfa09c17-04a8-48b1-a665-1f85c966a640 req-cf448835-7575-43d0-b70e-442158e73808 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Updating instance_info_cache with network_info: [{"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.913 187122 DEBUG oslo_concurrency.lockutils [req-cfa09c17-04a8-48b1-a665-1f85c966a640 req-cf448835-7575-43d0-b70e-442158e73808 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.955 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995066.954749, 137f208f-228c-4e2f-9395-79c5d643c17a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.955 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] VM Started (Lifecycle Event)
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.957 187122 DEBUG nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.960 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.963 187122 INFO nova.virt.libvirt.driver [-] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Instance spawned successfully.
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.963 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.972 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.976 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.980 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.980 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.980 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.980 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.981 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:46 compute-0 nova_compute[187118]: 2025-11-24 14:37:46.981 187122 DEBUG nova.virt.libvirt.driver [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.002 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.003 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995066.9549756, 137f208f-228c-4e2f-9395-79c5d643c17a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.003 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] VM Paused (Lifecycle Event)
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.023 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.026 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995066.9599688, 137f208f-228c-4e2f-9395-79c5d643c17a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.026 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] VM Resumed (Lifecycle Event)
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.039 187122 INFO nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Took 5.18 seconds to spawn the instance on the hypervisor.
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.039 187122 DEBUG nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.040 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.045 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.068 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.088 187122 INFO nova.compute.manager [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Took 5.79 seconds to build instance.
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.099 187122 DEBUG oslo_concurrency.lockutils [None req-5096e466-14ca-438e-9812-8a2000e52c9d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.894s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:47 compute-0 nova_compute[187118]: 2025-11-24 14:37:47.715 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:37:48 compute-0 nova_compute[187118]: 2025-11-24 14:37:48.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:37:48 compute-0 nova_compute[187118]: 2025-11-24 14:37:48.851 187122 DEBUG nova.compute.manager [req-17574b34-63ea-4a06-ae0f-b573825e9f9f req-9279870c-5cb9-4446-a5da-732ab61c6062 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received event network-vif-plugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:48 compute-0 nova_compute[187118]: 2025-11-24 14:37:48.851 187122 DEBUG oslo_concurrency.lockutils [req-17574b34-63ea-4a06-ae0f-b573825e9f9f req-9279870c-5cb9-4446-a5da-732ab61c6062 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:48 compute-0 nova_compute[187118]: 2025-11-24 14:37:48.852 187122 DEBUG oslo_concurrency.lockutils [req-17574b34-63ea-4a06-ae0f-b573825e9f9f req-9279870c-5cb9-4446-a5da-732ab61c6062 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:48 compute-0 nova_compute[187118]: 2025-11-24 14:37:48.852 187122 DEBUG oslo_concurrency.lockutils [req-17574b34-63ea-4a06-ae0f-b573825e9f9f req-9279870c-5cb9-4446-a5da-732ab61c6062 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:48 compute-0 nova_compute[187118]: 2025-11-24 14:37:48.852 187122 DEBUG nova.compute.manager [req-17574b34-63ea-4a06-ae0f-b573825e9f9f req-9279870c-5cb9-4446-a5da-732ab61c6062 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] No waiting events found dispatching network-vif-plugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:37:48 compute-0 nova_compute[187118]: 2025-11-24 14:37:48.852 187122 WARNING nova.compute.manager [req-17574b34-63ea-4a06-ae0f-b573825e9f9f req-9279870c-5cb9-4446-a5da-732ab61c6062 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received unexpected event network-vif-plugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 for instance with vm_state active and task_state None.
Nov 24 14:37:49 compute-0 nova_compute[187118]: 2025-11-24 14:37:49.950 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:49 compute-0 nova_compute[187118]: 2025-11-24 14:37:49.977 187122 DEBUG nova.compute.manager [req-716c902a-3736-4650-bf58-1fcd27c1d69f req-49c2e43a-dc73-43bf-9306-68a524214688 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received event network-changed-4bc0703d-e5d0-4012-b8af-218ac012eb92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:37:49 compute-0 nova_compute[187118]: 2025-11-24 14:37:49.977 187122 DEBUG nova.compute.manager [req-716c902a-3736-4650-bf58-1fcd27c1d69f req-49c2e43a-dc73-43bf-9306-68a524214688 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Refreshing instance network info cache due to event network-changed-4bc0703d-e5d0-4012-b8af-218ac012eb92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:37:49 compute-0 nova_compute[187118]: 2025-11-24 14:37:49.978 187122 DEBUG oslo_concurrency.lockutils [req-716c902a-3736-4650-bf58-1fcd27c1d69f req-49c2e43a-dc73-43bf-9306-68a524214688 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:37:49 compute-0 nova_compute[187118]: 2025-11-24 14:37:49.979 187122 DEBUG oslo_concurrency.lockutils [req-716c902a-3736-4650-bf58-1fcd27c1d69f req-49c2e43a-dc73-43bf-9306-68a524214688 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:37:49 compute-0 nova_compute[187118]: 2025-11-24 14:37:49.979 187122 DEBUG nova.network.neutron [req-716c902a-3736-4650-bf58-1fcd27c1d69f req-49c2e43a-dc73-43bf-9306-68a524214688 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Refreshing network info cache for port 4bc0703d-e5d0-4012-b8af-218ac012eb92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:37:50 compute-0 podman[218452]: 2025-11-24 14:37:50.457878951 +0000 UTC m=+0.064950280 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Nov 24 14:37:50 compute-0 nova_compute[187118]: 2025-11-24 14:37:50.685 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:54 compute-0 nova_compute[187118]: 2025-11-24 14:37:54.354 187122 DEBUG nova.network.neutron [req-716c902a-3736-4650-bf58-1fcd27c1d69f req-49c2e43a-dc73-43bf-9306-68a524214688 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Updated VIF entry in instance network info cache for port 4bc0703d-e5d0-4012-b8af-218ac012eb92. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:37:54 compute-0 nova_compute[187118]: 2025-11-24 14:37:54.354 187122 DEBUG nova.network.neutron [req-716c902a-3736-4650-bf58-1fcd27c1d69f req-49c2e43a-dc73-43bf-9306-68a524214688 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Updating instance_info_cache with network_info: [{"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:37:54 compute-0 nova_compute[187118]: 2025-11-24 14:37:54.373 187122 DEBUG oslo_concurrency.lockutils [req-716c902a-3736-4650-bf58-1fcd27c1d69f req-49c2e43a-dc73-43bf-9306-68a524214688 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:37:54 compute-0 podman[218472]: 2025-11-24 14:37:54.474931779 +0000 UTC m=+0.079956852 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 24 14:37:54 compute-0 podman[218473]: 2025-11-24 14:37:54.515018037 +0000 UTC m=+0.109028718 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:37:54 compute-0 nova_compute[187118]: 2025-11-24 14:37:54.953 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:55 compute-0 nova_compute[187118]: 2025-11-24 14:37:55.687 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:37:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:56.664 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:37:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:56.665 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:37:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:37:56.666 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:37:58 compute-0 podman[218512]: 2025-11-24 14:37:58.483806902 +0000 UTC m=+0.079957813 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter)
Nov 24 14:37:58 compute-0 podman[218511]: 2025-11-24 14:37:58.484881841 +0000 UTC m=+0.092281020 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:37:59 compute-0 nova_compute[187118]: 2025-11-24 14:37:59.954 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:00 compute-0 nova_compute[187118]: 2025-11-24 14:38:00.689 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:00 compute-0 ovn_controller[95613]: 2025-11-24T14:38:00Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5b:68:e5 10.100.0.11
Nov 24 14:38:00 compute-0 ovn_controller[95613]: 2025-11-24T14:38:00Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5b:68:e5 10.100.0.11
Nov 24 14:38:04 compute-0 nova_compute[187118]: 2025-11-24 14:38:04.956 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:05 compute-0 nova_compute[187118]: 2025-11-24 14:38:05.692 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:06 compute-0 podman[218571]: 2025-11-24 14:38:06.483088259 +0000 UTC m=+0.080934588 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:38:07 compute-0 sshd-session[218570]: Connection closed by authenticating user root 80.94.95.115 port 48414 [preauth]
Nov 24 14:38:09 compute-0 nova_compute[187118]: 2025-11-24 14:38:09.959 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:10 compute-0 nova_compute[187118]: 2025-11-24 14:38:10.694 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:13 compute-0 podman[218596]: 2025-11-24 14:38:13.44800818 +0000 UTC m=+0.055118410 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:38:14 compute-0 nova_compute[187118]: 2025-11-24 14:38:14.960 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:15 compute-0 nova_compute[187118]: 2025-11-24 14:38:15.696 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:19 compute-0 ovn_controller[95613]: 2025-11-24T14:38:19Z|00147|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Nov 24 14:38:19 compute-0 nova_compute[187118]: 2025-11-24 14:38:19.962 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:20 compute-0 nova_compute[187118]: 2025-11-24 14:38:20.698 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:21 compute-0 podman[218620]: 2025-11-24 14:38:21.458116859 +0000 UTC m=+0.069783633 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:38:22 compute-0 nova_compute[187118]: 2025-11-24 14:38:22.522 187122 INFO nova.compute.manager [None req-a4e7d77a-cd8f-4478-97ba-21489e843ca1 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Get console output
Nov 24 14:38:22 compute-0 nova_compute[187118]: 2025-11-24 14:38:22.529 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:38:23 compute-0 nova_compute[187118]: 2025-11-24 14:38:23.576 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:23 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:23.577 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:38:23 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:23.578 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:38:23 compute-0 nova_compute[187118]: 2025-11-24 14:38:23.746 187122 DEBUG nova.compute.manager [req-42c04c0c-2a15-495f-aadd-d2ffd01f426d req-627ac4e4-5273-45a9-89dd-6b0dc1d180f5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:23 compute-0 nova_compute[187118]: 2025-11-24 14:38:23.746 187122 DEBUG nova.compute.manager [req-42c04c0c-2a15-495f-aadd-d2ffd01f426d req-627ac4e4-5273-45a9-89dd-6b0dc1d180f5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing instance network info cache due to event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:38:23 compute-0 nova_compute[187118]: 2025-11-24 14:38:23.746 187122 DEBUG oslo_concurrency.lockutils [req-42c04c0c-2a15-495f-aadd-d2ffd01f426d req-627ac4e4-5273-45a9-89dd-6b0dc1d180f5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:38:23 compute-0 nova_compute[187118]: 2025-11-24 14:38:23.747 187122 DEBUG oslo_concurrency.lockutils [req-42c04c0c-2a15-495f-aadd-d2ffd01f426d req-627ac4e4-5273-45a9-89dd-6b0dc1d180f5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:38:23 compute-0 nova_compute[187118]: 2025-11-24 14:38:23.747 187122 DEBUG nova.network.neutron [req-42c04c0c-2a15-495f-aadd-d2ffd01f426d req-627ac4e4-5273-45a9-89dd-6b0dc1d180f5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:38:24 compute-0 nova_compute[187118]: 2025-11-24 14:38:24.794 187122 INFO nova.compute.manager [None req-59c45ceb-6f17-4fd2-a0a3-c1ab953f998e ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Get console output
Nov 24 14:38:24 compute-0 nova_compute[187118]: 2025-11-24 14:38:24.798 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:38:24 compute-0 nova_compute[187118]: 2025-11-24 14:38:24.965 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.387 187122 DEBUG nova.network.neutron [req-42c04c0c-2a15-495f-aadd-d2ffd01f426d req-627ac4e4-5273-45a9-89dd-6b0dc1d180f5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updated VIF entry in instance network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.388 187122 DEBUG nova.network.neutron [req-42c04c0c-2a15-495f-aadd-d2ffd01f426d req-627ac4e4-5273-45a9-89dd-6b0dc1d180f5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updating instance_info_cache with network_info: [{"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.404 187122 DEBUG oslo_concurrency.lockutils [req-42c04c0c-2a15-495f-aadd-d2ffd01f426d req-627ac4e4-5273-45a9-89dd-6b0dc1d180f5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:38:25 compute-0 podman[218641]: 2025-11-24 14:38:25.485447467 +0000 UTC m=+0.074285386 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 14:38:25 compute-0 podman[218640]: 2025-11-24 14:38:25.487046761 +0000 UTC m=+0.086539342 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.701 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.826 187122 DEBUG nova.compute.manager [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-unplugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.827 187122 DEBUG oslo_concurrency.lockutils [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.827 187122 DEBUG oslo_concurrency.lockutils [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.828 187122 DEBUG oslo_concurrency.lockutils [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.828 187122 DEBUG nova.compute.manager [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] No waiting events found dispatching network-vif-unplugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.828 187122 WARNING nova.compute.manager [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received unexpected event network-vif-unplugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 for instance with vm_state active and task_state None.
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.828 187122 DEBUG nova.compute.manager [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.829 187122 DEBUG oslo_concurrency.lockutils [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.829 187122 DEBUG oslo_concurrency.lockutils [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.829 187122 DEBUG oslo_concurrency.lockutils [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.830 187122 DEBUG nova.compute.manager [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] No waiting events found dispatching network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:38:25 compute-0 nova_compute[187118]: 2025-11-24 14:38:25.830 187122 WARNING nova.compute.manager [req-372221c1-e84e-41ab-99f1-66ede1ea9b8a req-fb7d864e-11b2-4fd9-8648-4fba18533e0f 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received unexpected event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 for instance with vm_state active and task_state None.
Nov 24 14:38:26 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:26.581 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:26 compute-0 nova_compute[187118]: 2025-11-24 14:38:26.711 187122 INFO nova.compute.manager [None req-5cb6c92a-95cc-43c8-8fbf-886b5eb4cc3d ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Get console output
Nov 24 14:38:26 compute-0 nova_compute[187118]: 2025-11-24 14:38:26.715 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.840 187122 DEBUG oslo_concurrency.lockutils [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "137f208f-228c-4e2f-9395-79c5d643c17a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.841 187122 DEBUG oslo_concurrency.lockutils [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.842 187122 DEBUG oslo_concurrency.lockutils [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.842 187122 DEBUG oslo_concurrency.lockutils [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.843 187122 DEBUG oslo_concurrency.lockutils [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.845 187122 INFO nova.compute.manager [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Terminating instance
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.848 187122 DEBUG nova.compute.manager [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:38:27 compute-0 kernel: tap4bc0703d-e5 (unregistering): left promiscuous mode
Nov 24 14:38:27 compute-0 NetworkManager[55697]: <info>  [1763995107.8761] device (tap4bc0703d-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.885 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:27 compute-0 ovn_controller[95613]: 2025-11-24T14:38:27Z|00148|binding|INFO|Releasing lport 4bc0703d-e5d0-4012-b8af-218ac012eb92 from this chassis (sb_readonly=0)
Nov 24 14:38:27 compute-0 ovn_controller[95613]: 2025-11-24T14:38:27Z|00149|binding|INFO|Setting lport 4bc0703d-e5d0-4012-b8af-218ac012eb92 down in Southbound
Nov 24 14:38:27 compute-0 ovn_controller[95613]: 2025-11-24T14:38:27Z|00150|binding|INFO|Removing iface tap4bc0703d-e5 ovn-installed in OVS
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.890 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:27.898 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:68:e5 10.100.0.11'], port_security=['fa:16:3e:5b:68:e5 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '137f208f-228c-4e2f-9395-79c5d643c17a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a71ff4a1-4692-40dc-a195-bd7cee824485', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23fd8537-aa59-4c32-8488-c8a540b7ddee, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=4bc0703d-e5d0-4012-b8af-218ac012eb92) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:38:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:27.901 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 4bc0703d-e5d0-4012-b8af-218ac012eb92 in datapath 4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6 unbound from our chassis
Nov 24 14:38:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:27.903 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.916 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.926 187122 DEBUG nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.926 187122 DEBUG nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing instance network info cache due to event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.927 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.927 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:38:27 compute-0 nova_compute[187118]: 2025-11-24 14:38:27.927 187122 DEBUG nova.network.neutron [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:38:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:27.933 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[64acc1da-f618-4adb-9637-7c710b57cd28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:27.978 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[88e96849-2186-4b82-981c-9f6ed4450a16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:27 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:27.983 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf46f10-eb07-4666-9943-222ae7ce7fc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:27 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 24 14:38:27 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 14.422s CPU time.
Nov 24 14:38:27 compute-0 systemd-machined[153483]: Machine qemu-12-instance-0000000c terminated.
Nov 24 14:38:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:28.032 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[fd1256da-d306-4997-a30c-2f8f5fbde720]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:28.061 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[b11a16c9-a532-4bdf-8596-71b917b3c281]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f40bf56-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:1a:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331505, 'reachable_time': 33377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218692, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.093 187122 DEBUG nova.compute.manager [req-64be715d-bd87-4617-acf4-00b9829a8416 req-2472855d-464f-4619-bdd3-f2c89b59ad66 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received event network-vif-unplugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.094 187122 DEBUG oslo_concurrency.lockutils [req-64be715d-bd87-4617-acf4-00b9829a8416 req-2472855d-464f-4619-bdd3-f2c89b59ad66 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:28.093 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[9418a41f-52f2-410e-905d-05975b6b10b4]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap4f40bf56-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 331517, 'tstamp': 331517}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218696, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap4f40bf56-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 331520, 'tstamp': 331520}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218696, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.094 187122 DEBUG oslo_concurrency.lockutils [req-64be715d-bd87-4617-acf4-00b9829a8416 req-2472855d-464f-4619-bdd3-f2c89b59ad66 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.095 187122 DEBUG oslo_concurrency.lockutils [req-64be715d-bd87-4617-acf4-00b9829a8416 req-2472855d-464f-4619-bdd3-f2c89b59ad66 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.095 187122 DEBUG nova.compute.manager [req-64be715d-bd87-4617-acf4-00b9829a8416 req-2472855d-464f-4619-bdd3-f2c89b59ad66 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] No waiting events found dispatching network-vif-unplugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.095 187122 DEBUG nova.compute.manager [req-64be715d-bd87-4617-acf4-00b9829a8416 req-2472855d-464f-4619-bdd3-f2c89b59ad66 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received event network-vif-unplugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:38:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:28.096 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f40bf56-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.098 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:28.106 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f40bf56-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.106 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:28.107 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:38:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:28.107 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4f40bf56-b0, col_values=(('external_ids', {'iface-id': 'aa5ffafb-d507-447e-b6a0-062a4b8e8014'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:28 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:28.108 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.134 187122 INFO nova.virt.libvirt.driver [-] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Instance destroyed successfully.
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.135 187122 DEBUG nova.objects.instance [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid 137f208f-228c-4e2f-9395-79c5d643c17a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.146 187122 DEBUG nova.virt.libvirt.vif [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:37:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-975957140',display_name='tempest-TestNetworkBasicOps-server-975957140',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-975957140',id=12,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCxNy0dBa04REfvaesgczGDro8bdjrxvI6RHFS+sMtDcQUVQcSPAN48ZwR7KM8mvxUtNFIHbPfR/lPzg6WR96yLxUnQdHJnNOkgv1D0zLFDDPkkqNWEl20LKoSGubz5pXQ==',key_name='tempest-TestNetworkBasicOps-1145703243',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:37:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-9chpac4o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:37:47Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=137f208f-228c-4e2f-9395-79c5d643c17a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.147 187122 DEBUG nova.network.os_vif_util [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "address": "fa:16:3e:5b:68:e5", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4bc0703d-e5", "ovs_interfaceid": "4bc0703d-e5d0-4012-b8af-218ac012eb92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.148 187122 DEBUG nova.network.os_vif_util [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5b:68:e5,bridge_name='br-int',has_traffic_filtering=True,id=4bc0703d-e5d0-4012-b8af-218ac012eb92,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bc0703d-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.148 187122 DEBUG os_vif [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:68:e5,bridge_name='br-int',has_traffic_filtering=True,id=4bc0703d-e5d0-4012-b8af-218ac012eb92,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bc0703d-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.150 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.150 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4bc0703d-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.152 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.153 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.156 187122 INFO os_vif [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5b:68:e5,bridge_name='br-int',has_traffic_filtering=True,id=4bc0703d-e5d0-4012-b8af-218ac012eb92,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4bc0703d-e5')
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.157 187122 INFO nova.virt.libvirt.driver [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Deleting instance files /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a_del
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.158 187122 INFO nova.virt.libvirt.driver [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Deletion of /var/lib/nova/instances/137f208f-228c-4e2f-9395-79c5d643c17a_del complete
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.213 187122 INFO nova.compute.manager [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.214 187122 DEBUG oslo.service.loopingcall [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.215 187122 DEBUG nova.compute.manager [-] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.215 187122 DEBUG nova.network.neutron [-] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.891 187122 DEBUG nova.network.neutron [-] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.916 187122 INFO nova.compute.manager [-] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Took 0.70 seconds to deallocate network for instance.
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.956 187122 DEBUG oslo_concurrency.lockutils [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:28 compute-0 nova_compute[187118]: 2025-11-24 14:38:28.957 187122 DEBUG oslo_concurrency.lockutils [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.042 187122 DEBUG nova.compute.provider_tree [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.053 187122 DEBUG nova.scheduler.client.report [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.069 187122 DEBUG oslo_concurrency.lockutils [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.087 187122 INFO nova.scheduler.client.report [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance 137f208f-228c-4e2f-9395-79c5d643c17a
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.154 187122 DEBUG oslo_concurrency.lockutils [None req-44266dd8-40b6-4c3d-bb6e-b7bc2c0c52cc ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.354 187122 DEBUG nova.network.neutron [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updated VIF entry in instance network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.355 187122 DEBUG nova.network.neutron [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updating instance_info_cache with network_info: [{"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.367 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.368 187122 DEBUG nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.368 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.368 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.368 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.368 187122 DEBUG nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] No waiting events found dispatching network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.369 187122 WARNING nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received unexpected event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 for instance with vm_state active and task_state None.
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.369 187122 DEBUG nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.369 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.369 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.369 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.369 187122 DEBUG nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] No waiting events found dispatching network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.369 187122 WARNING nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received unexpected event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 for instance with vm_state active and task_state None.
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.370 187122 DEBUG nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received event network-changed-4bc0703d-e5d0-4012-b8af-218ac012eb92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.370 187122 DEBUG nova.compute.manager [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Refreshing instance network info cache due to event network-changed-4bc0703d-e5d0-4012-b8af-218ac012eb92. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.370 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.370 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.370 187122 DEBUG nova.network.neutron [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Refreshing network info cache for port 4bc0703d-e5d0-4012-b8af-218ac012eb92 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.381 187122 DEBUG nova.compute.utils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Can not refresh info_cache because instance was not found refresh_info_cache_for_instance /usr/lib/python3.9/site-packages/nova/compute/utils.py:1010
Nov 24 14:38:29 compute-0 podman[218713]: 2025-11-24 14:38:29.462091668 +0000 UTC m=+0.061216588 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Nov 24 14:38:29 compute-0 podman[218712]: 2025-11-24 14:38:29.493343674 +0000 UTC m=+0.093090921 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.514 187122 INFO nova.network.neutron [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Port 4bc0703d-e5d0-4012-b8af-218ac012eb92 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.514 187122 DEBUG nova.network.neutron [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.528 187122 DEBUG oslo_concurrency.lockutils [req-7e3947d0-d56a-4f5e-bf61-5d6727c39268 req-9935b0cf-980a-4670-9e11-970f865ba6b5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-137f208f-228c-4e2f-9395-79c5d643c17a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:38:29 compute-0 nova_compute[187118]: 2025-11-24 14:38:29.968 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.294 187122 DEBUG oslo_concurrency.lockutils [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.295 187122 DEBUG oslo_concurrency.lockutils [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.295 187122 DEBUG oslo_concurrency.lockutils [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.295 187122 DEBUG oslo_concurrency.lockutils [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.296 187122 DEBUG oslo_concurrency.lockutils [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.297 187122 INFO nova.compute.manager [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Terminating instance
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.298 187122 DEBUG nova.compute.manager [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:38:30 compute-0 kernel: tapefa50b36-70 (unregistering): left promiscuous mode
Nov 24 14:38:30 compute-0 NetworkManager[55697]: <info>  [1763995110.3305] device (tapefa50b36-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:38:30 compute-0 ovn_controller[95613]: 2025-11-24T14:38:30Z|00151|binding|INFO|Releasing lport efa50b36-70e9-4adb-b0fb-80e7ba4232c1 from this chassis (sb_readonly=0)
Nov 24 14:38:30 compute-0 ovn_controller[95613]: 2025-11-24T14:38:30Z|00152|binding|INFO|Setting lport efa50b36-70e9-4adb-b0fb-80e7ba4232c1 down in Southbound
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.337 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:30 compute-0 ovn_controller[95613]: 2025-11-24T14:38:30Z|00153|binding|INFO|Removing iface tapefa50b36-70 ovn-installed in OVS
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.343 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.346 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:15:57 10.100.0.9'], port_security=['fa:16:3e:a3:15:57 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'e2b8d81d-63e2-4024-80be-476801e2ac7f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '8', 'neutron:security_group_ids': '8da34392-5850-485a-8d0d-5dd31b6fc169', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=23fd8537-aa59-4c32-8488-c8a540b7ddee, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=efa50b36-70e9-4adb-b0fb-80e7ba4232c1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.348 104469 INFO neutron.agent.ovn.metadata.agent [-] Port efa50b36-70e9-4adb-b0fb-80e7ba4232c1 in datapath 4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6 unbound from our chassis
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.349 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.351 187122 DEBUG nova.compute.manager [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received event network-vif-plugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.350 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[fea9f40b-d6df-4fd5-87fc-8ef660144bac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.351 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6 namespace which is not needed anymore
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.352 187122 DEBUG oslo_concurrency.lockutils [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.352 187122 DEBUG oslo_concurrency.lockutils [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.354 187122 DEBUG oslo_concurrency.lockutils [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "137f208f-228c-4e2f-9395-79c5d643c17a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.354 187122 DEBUG nova.compute.manager [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] No waiting events found dispatching network-vif-plugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.355 187122 WARNING nova.compute.manager [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received unexpected event network-vif-plugged-4bc0703d-e5d0-4012-b8af-218ac012eb92 for instance with vm_state deleted and task_state None.
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.355 187122 DEBUG nova.compute.manager [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Received event network-vif-deleted-4bc0703d-e5d0-4012-b8af-218ac012eb92 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.356 187122 DEBUG nova.compute.manager [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.356 187122 DEBUG nova.compute.manager [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing instance network info cache due to event network-changed-efa50b36-70e9-4adb-b0fb-80e7ba4232c1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.356 187122 DEBUG oslo_concurrency.lockutils [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.357 187122 DEBUG oslo_concurrency.lockutils [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.357 187122 DEBUG nova.network.neutron [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Refreshing network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.370 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:30 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Nov 24 14:38:30 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 14.473s CPU time.
Nov 24 14:38:30 compute-0 systemd-machined[153483]: Machine qemu-11-instance-0000000b terminated.
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.507 187122 DEBUG nova.compute.manager [req-a6251dea-f9fa-4365-9a2a-c16497c94de3 req-c066725e-9fe8-4028-81de-e97c81e17994 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-unplugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.507 187122 DEBUG oslo_concurrency.lockutils [req-a6251dea-f9fa-4365-9a2a-c16497c94de3 req-c066725e-9fe8-4028-81de-e97c81e17994 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.509 187122 DEBUG oslo_concurrency.lockutils [req-a6251dea-f9fa-4365-9a2a-c16497c94de3 req-c066725e-9fe8-4028-81de-e97c81e17994 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.510 187122 DEBUG oslo_concurrency.lockutils [req-a6251dea-f9fa-4365-9a2a-c16497c94de3 req-c066725e-9fe8-4028-81de-e97c81e17994 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.510 187122 DEBUG nova.compute.manager [req-a6251dea-f9fa-4365-9a2a-c16497c94de3 req-c066725e-9fe8-4028-81de-e97c81e17994 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] No waiting events found dispatching network-vif-unplugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.511 187122 DEBUG nova.compute.manager [req-a6251dea-f9fa-4365-9a2a-c16497c94de3 req-c066725e-9fe8-4028-81de-e97c81e17994 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-unplugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:38:30 compute-0 NetworkManager[55697]: <info>  [1763995110.5264] manager: (tapefa50b36-70): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Nov 24 14:38:30 compute-0 neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6[218310]: [NOTICE]   (218314) : haproxy version is 2.8.14-c23fe91
Nov 24 14:38:30 compute-0 neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6[218310]: [NOTICE]   (218314) : path to executable is /usr/sbin/haproxy
Nov 24 14:38:30 compute-0 neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6[218310]: [WARNING]  (218314) : Exiting Master process...
Nov 24 14:38:30 compute-0 neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6[218310]: [WARNING]  (218314) : Exiting Master process...
Nov 24 14:38:30 compute-0 neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6[218310]: [ALERT]    (218314) : Current worker (218316) exited with code 143 (Terminated)
Nov 24 14:38:30 compute-0 neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6[218310]: [WARNING]  (218314) : All workers exited. Exiting... (0)
Nov 24 14:38:30 compute-0 systemd[1]: libpod-9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11.scope: Deactivated successfully.
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.578 187122 INFO nova.virt.libvirt.driver [-] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Instance destroyed successfully.
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.579 187122 DEBUG nova.objects.instance [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid e2b8d81d-63e2-4024-80be-476801e2ac7f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:38:30 compute-0 podman[218779]: 2025-11-24 14:38:30.581520958 +0000 UTC m=+0.069807754 container died 9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.592 187122 DEBUG nova.virt.libvirt.vif [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:37:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-242850204',display_name='tempest-TestNetworkBasicOps-server-242850204',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-242850204',id=11,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAlHrXi81T9a9pzlsIgsG5PWijsu3mk02tp+/Y4nJIjPkb//9Qu9T4jg7ZzsFaSrbIgz/kmPIQgOPrmf1cgC83C2QFTy/JgfN/28UBP5yyShoUahaNQHScKtyj+wzILH5Q==',key_name='tempest-TestNetworkBasicOps-110487052',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:37:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-hqtoo21g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:37:28Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=e2b8d81d-63e2-4024-80be-476801e2ac7f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.593 187122 DEBUG nova.network.os_vif_util [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.594 187122 DEBUG nova.network.os_vif_util [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a3:15:57,bridge_name='br-int',has_traffic_filtering=True,id=efa50b36-70e9-4adb-b0fb-80e7ba4232c1,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefa50b36-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.595 187122 DEBUG os_vif [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:15:57,bridge_name='br-int',has_traffic_filtering=True,id=efa50b36-70e9-4adb-b0fb-80e7ba4232c1,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefa50b36-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.598 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.598 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefa50b36-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.600 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.607 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.611 187122 INFO os_vif [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:15:57,bridge_name='br-int',has_traffic_filtering=True,id=efa50b36-70e9-4adb-b0fb-80e7ba4232c1,network=Network(4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapefa50b36-70')
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.612 187122 INFO nova.virt.libvirt.driver [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Deleting instance files /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f_del
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.613 187122 INFO nova.virt.libvirt.driver [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Deletion of /var/lib/nova/instances/e2b8d81d-63e2-4024-80be-476801e2ac7f_del complete
Nov 24 14:38:30 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11-userdata-shm.mount: Deactivated successfully.
Nov 24 14:38:30 compute-0 systemd[1]: var-lib-containers-storage-overlay-615bc42fd8ce61e8917f6387703b85de5f3dfb3fe698b370d819eeb691e704df-merged.mount: Deactivated successfully.
Nov 24 14:38:30 compute-0 podman[218779]: 2025-11-24 14:38:30.638497629 +0000 UTC m=+0.126784395 container cleanup 9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.659 187122 INFO nova.compute.manager [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Took 0.36 seconds to destroy the instance on the hypervisor.
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.659 187122 DEBUG oslo.service.loopingcall [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.660 187122 DEBUG nova.compute.manager [-] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.660 187122 DEBUG nova.network.neutron [-] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:38:30 compute-0 systemd[1]: libpod-conmon-9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11.scope: Deactivated successfully.
Nov 24 14:38:30 compute-0 podman[218820]: 2025-11-24 14:38:30.734718765 +0000 UTC m=+0.062166854 container remove 9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.741 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[47bff489-1c16-45c3-90e2-2e8e73e0f254]: (4, ('Mon Nov 24 02:38:30 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6 (9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11)\n9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11\nMon Nov 24 02:38:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6 (9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11)\n9bdcf279c5a9f00f732d590146e0931f1e937b71ef998eac054a2a8c18c99e11\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.743 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[29e7fae1-edca-4d6f-bed5-ddce7cd98812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.744 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f40bf56-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.746 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:30 compute-0 kernel: tap4f40bf56-b0: left promiscuous mode
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.751 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[de065654-0226-40f6-bd09-f15b75e36c94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:30 compute-0 nova_compute[187118]: 2025-11-24 14:38:30.762 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.777 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[027da9ae-37f2-4111-a689-0b8c52d11688]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.778 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[3b259c31-c591-4e0f-b174-3ed12ecf224f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.802 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4704dc-ed6a-431d-85b2-60b49f660960]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 331497, 'reachable_time': 43705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218838, 'error': None, 'target': 'ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.806 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:38:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:30.806 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b97d43-1198-4df4-8a3e-3c988c2c7fb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:30 compute-0 systemd[1]: run-netns-ovnmeta\x2d4f40bf56\x2db3f7\x2d4118\x2dbb4e\x2dd593fc7a9aa6.mount: Deactivated successfully.
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.101 187122 DEBUG nova.network.neutron [-] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.115 187122 INFO nova.compute.manager [-] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Took 0.46 seconds to deallocate network for instance.
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.153 187122 DEBUG oslo_concurrency.lockutils [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.154 187122 DEBUG oslo_concurrency.lockutils [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.196 187122 DEBUG nova.compute.provider_tree [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.210 187122 DEBUG nova.scheduler.client.report [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.230 187122 DEBUG oslo_concurrency.lockutils [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.256 187122 INFO nova.scheduler.client.report [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance e2b8d81d-63e2-4024-80be-476801e2ac7f
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.346 187122 DEBUG oslo_concurrency.lockutils [None req-1010bdbe-6d2e-43b9-9826-ec1da33049ce ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.359 187122 DEBUG nova.network.neutron [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updated VIF entry in instance network info cache for port efa50b36-70e9-4adb-b0fb-80e7ba4232c1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.360 187122 DEBUG nova.network.neutron [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Updating instance_info_cache with network_info: [{"id": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "address": "fa:16:3e:a3:15:57", "network": {"id": "4f40bf56-b3f7-4118-bb4e-d593fc7a9aa6", "bridge": "br-int", "label": "tempest-network-smoke--306412838", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapefa50b36-70", "ovs_interfaceid": "efa50b36-70e9-4adb-b0fb-80e7ba4232c1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:38:31 compute-0 nova_compute[187118]: 2025-11-24 14:38:31.382 187122 DEBUG oslo_concurrency.lockutils [req-34c65032-9302-4ef5-b8fd-896860965173 req-c40ffbc4-0d41-44b0-ab1f-cb229a2596c2 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-e2b8d81d-63e2-4024-80be-476801e2ac7f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:38:32 compute-0 nova_compute[187118]: 2025-11-24 14:38:32.578 187122 DEBUG nova.compute.manager [req-1fb99110-900e-48fa-ac42-2dda04db9e38 req-120d8bef-c950-4e5b-9607-ffc973d668ff 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:32 compute-0 nova_compute[187118]: 2025-11-24 14:38:32.581 187122 DEBUG oslo_concurrency.lockutils [req-1fb99110-900e-48fa-ac42-2dda04db9e38 req-120d8bef-c950-4e5b-9607-ffc973d668ff 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:32 compute-0 nova_compute[187118]: 2025-11-24 14:38:32.581 187122 DEBUG oslo_concurrency.lockutils [req-1fb99110-900e-48fa-ac42-2dda04db9e38 req-120d8bef-c950-4e5b-9607-ffc973d668ff 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:32 compute-0 nova_compute[187118]: 2025-11-24 14:38:32.582 187122 DEBUG oslo_concurrency.lockutils [req-1fb99110-900e-48fa-ac42-2dda04db9e38 req-120d8bef-c950-4e5b-9607-ffc973d668ff 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "e2b8d81d-63e2-4024-80be-476801e2ac7f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:32 compute-0 nova_compute[187118]: 2025-11-24 14:38:32.582 187122 DEBUG nova.compute.manager [req-1fb99110-900e-48fa-ac42-2dda04db9e38 req-120d8bef-c950-4e5b-9607-ffc973d668ff 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] No waiting events found dispatching network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:38:32 compute-0 nova_compute[187118]: 2025-11-24 14:38:32.582 187122 WARNING nova.compute.manager [req-1fb99110-900e-48fa-ac42-2dda04db9e38 req-120d8bef-c950-4e5b-9607-ffc973d668ff 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received unexpected event network-vif-plugged-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 for instance with vm_state deleted and task_state None.
Nov 24 14:38:32 compute-0 nova_compute[187118]: 2025-11-24 14:38:32.583 187122 DEBUG nova.compute.manager [req-1fb99110-900e-48fa-ac42-2dda04db9e38 req-120d8bef-c950-4e5b-9607-ffc973d668ff 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Received event network-vif-deleted-efa50b36-70e9-4adb-b0fb-80e7ba4232c1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:34 compute-0 nova_compute[187118]: 2025-11-24 14:38:34.522 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:34 compute-0 nova_compute[187118]: 2025-11-24 14:38:34.588 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:34 compute-0 nova_compute[187118]: 2025-11-24 14:38:34.970 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:35 compute-0 nova_compute[187118]: 2025-11-24 14:38:35.601 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:37 compute-0 podman[218840]: 2025-11-24 14:38:37.463510279 +0000 UTC m=+0.068881797 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:38:39 compute-0 nova_compute[187118]: 2025-11-24 14:38:39.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:39 compute-0 nova_compute[187118]: 2025-11-24 14:38:39.971 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:40 compute-0 nova_compute[187118]: 2025-11-24 14:38:40.604 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:40 compute-0 nova_compute[187118]: 2025-11-24 14:38:40.792 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:40 compute-0 nova_compute[187118]: 2025-11-24 14:38:40.811 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:41 compute-0 nova_compute[187118]: 2025-11-24 14:38:41.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:41 compute-0 nova_compute[187118]: 2025-11-24 14:38:41.823 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:41 compute-0 nova_compute[187118]: 2025-11-24 14:38:41.824 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:41 compute-0 nova_compute[187118]: 2025-11-24 14:38:41.824 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:41 compute-0 nova_compute[187118]: 2025-11-24 14:38:41.825 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.095 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.097 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5773MB free_disk=73.45868301391602GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.097 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.098 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.162 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.163 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.190 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.209 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.232 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:38:42 compute-0 nova_compute[187118]: 2025-11-24 14:38:42.233 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.130 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763995108.1280806, 137f208f-228c-4e2f-9395-79c5d643c17a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.131 187122 INFO nova.compute.manager [-] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] VM Stopped (Lifecycle Event)
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.151 187122 DEBUG nova.compute.manager [None req-a8c39821-dffd-46c9-8005-30db1d8a9412 - - - - - -] [instance: 137f208f-228c-4e2f-9395-79c5d643c17a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.233 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.234 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.235 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.247 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.248 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:43 compute-0 nova_compute[187118]: 2025-11-24 14:38:43.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:38:44 compute-0 podman[218865]: 2025-11-24 14:38:44.49005852 +0000 UTC m=+0.085320928 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:38:44 compute-0 nova_compute[187118]: 2025-11-24 14:38:44.973 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:45 compute-0 nova_compute[187118]: 2025-11-24 14:38:45.570 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763995110.5697224, e2b8d81d-63e2-4024-80be-476801e2ac7f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:38:45 compute-0 nova_compute[187118]: 2025-11-24 14:38:45.571 187122 INFO nova.compute.manager [-] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] VM Stopped (Lifecycle Event)
Nov 24 14:38:45 compute-0 nova_compute[187118]: 2025-11-24 14:38:45.595 187122 DEBUG nova.compute.manager [None req-67d71a72-1bfd-4ad6-8a5d-49e035511b0a - - - - - -] [instance: e2b8d81d-63e2-4024-80be-476801e2ac7f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:38:45 compute-0 nova_compute[187118]: 2025-11-24 14:38:45.607 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:45 compute-0 nova_compute[187118]: 2025-11-24 14:38:45.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:47 compute-0 nova_compute[187118]: 2025-11-24 14:38:47.790 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:49 compute-0 nova_compute[187118]: 2025-11-24 14:38:49.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:38:49 compute-0 nova_compute[187118]: 2025-11-24 14:38:49.975 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:50 compute-0 nova_compute[187118]: 2025-11-24 14:38:50.609 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.204 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "deec7f7a-de1e-4cb1-b74c-f47abc760797" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.205 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.220 187122 DEBUG nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.282 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.283 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.290 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.290 187122 INFO nova.compute.claims [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Claim successful on node compute-0.ctlplane.example.com
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.415 187122 DEBUG nova.compute.provider_tree [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.439 187122 DEBUG nova.scheduler.client.report [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.460 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.461 187122 DEBUG nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.612 187122 DEBUG nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.612 187122 DEBUG nova.network.neutron [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.624 187122 INFO nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.642 187122 DEBUG nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.770 187122 DEBUG nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.771 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.771 187122 INFO nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Creating image(s)
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.772 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "/var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.772 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.773 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "/var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.785 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.839 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.840 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "934740050c9d8b8b6777b6dbee3c76c574717cca" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.840 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.851 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.904 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.905 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.938 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca,backing_fmt=raw /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.939 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "934740050c9d8b8b6777b6dbee3c76c574717cca" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.940 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.995 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/934740050c9d8b8b6777b6dbee3c76c574717cca --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.996 187122 DEBUG nova.virt.disk.api [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Checking if we can resize image /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 24 14:38:51 compute-0 nova_compute[187118]: 2025-11-24 14:38:51.996 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:38:52 compute-0 nova_compute[187118]: 2025-11-24 14:38:52.049 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:38:52 compute-0 nova_compute[187118]: 2025-11-24 14:38:52.051 187122 DEBUG nova.virt.disk.api [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Cannot resize image /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 24 14:38:52 compute-0 nova_compute[187118]: 2025-11-24 14:38:52.051 187122 DEBUG nova.objects.instance [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'migration_context' on Instance uuid deec7f7a-de1e-4cb1-b74c-f47abc760797 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:38:52 compute-0 nova_compute[187118]: 2025-11-24 14:38:52.066 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 24 14:38:52 compute-0 nova_compute[187118]: 2025-11-24 14:38:52.066 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Ensure instance console log exists: /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 24 14:38:52 compute-0 nova_compute[187118]: 2025-11-24 14:38:52.067 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:52 compute-0 nova_compute[187118]: 2025-11-24 14:38:52.067 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:52 compute-0 nova_compute[187118]: 2025-11-24 14:38:52.067 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:52 compute-0 nova_compute[187118]: 2025-11-24 14:38:52.391 187122 DEBUG nova.policy [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef366911f162401f897bcd979ad0c45a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 24 14:38:52 compute-0 podman[218906]: 2025-11-24 14:38:52.440356749 +0000 UTC m=+0.047291016 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 14:38:53 compute-0 nova_compute[187118]: 2025-11-24 14:38:53.788 187122 DEBUG nova.network.neutron [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Successfully created port: 1f30b9ed-543c-4644-a445-5d12cae7ae11 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 24 14:38:54 compute-0 nova_compute[187118]: 2025-11-24 14:38:54.976 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:55 compute-0 nova_compute[187118]: 2025-11-24 14:38:55.336 187122 DEBUG nova.network.neutron [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Successfully updated port: 1f30b9ed-543c-4644-a445-5d12cae7ae11 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Nov 24 14:38:55 compute-0 nova_compute[187118]: 2025-11-24 14:38:55.352 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:38:55 compute-0 nova_compute[187118]: 2025-11-24 14:38:55.353 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquired lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:38:55 compute-0 nova_compute[187118]: 2025-11-24 14:38:55.353 187122 DEBUG nova.network.neutron [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Nov 24 14:38:55 compute-0 nova_compute[187118]: 2025-11-24 14:38:55.455 187122 DEBUG nova.compute.manager [req-3c861e15-155c-4c34-aeca-5fc79070eea0 req-7671419d-e66c-4ee7-bad7-ff82dbd90ad8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received event network-changed-1f30b9ed-543c-4644-a445-5d12cae7ae11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:55 compute-0 nova_compute[187118]: 2025-11-24 14:38:55.455 187122 DEBUG nova.compute.manager [req-3c861e15-155c-4c34-aeca-5fc79070eea0 req-7671419d-e66c-4ee7-bad7-ff82dbd90ad8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Refreshing instance network info cache due to event network-changed-1f30b9ed-543c-4644-a445-5d12cae7ae11. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:38:55 compute-0 nova_compute[187118]: 2025-11-24 14:38:55.455 187122 DEBUG oslo_concurrency.lockutils [req-3c861e15-155c-4c34-aeca-5fc79070eea0 req-7671419d-e66c-4ee7-bad7-ff82dbd90ad8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:38:55 compute-0 nova_compute[187118]: 2025-11-24 14:38:55.486 187122 DEBUG nova.network.neutron [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Nov 24 14:38:55 compute-0 nova_compute[187118]: 2025-11-24 14:38:55.612 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.406 187122 DEBUG nova.network.neutron [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Updating instance_info_cache with network_info: [{"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:38:56 compute-0 podman[218926]: 2025-11-24 14:38:56.457385836 +0000 UTC m=+0.059603864 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.459 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Releasing lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.460 187122 DEBUG nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Instance network_info: |[{"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.460 187122 DEBUG oslo_concurrency.lockutils [req-3c861e15-155c-4c34-aeca-5fc79070eea0 req-7671419d-e66c-4ee7-bad7-ff82dbd90ad8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.461 187122 DEBUG nova.network.neutron [req-3c861e15-155c-4c34-aeca-5fc79070eea0 req-7671419d-e66c-4ee7-bad7-ff82dbd90ad8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Refreshing network info cache for port 1f30b9ed-543c-4644-a445-5d12cae7ae11 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.464 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Start _get_guest_xml network_info=[{"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'encryption_format': None, 'size': 0, 'boot_index': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'guest_format': None, 'device_name': '/dev/vda', 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '54a328f6-92ea-410e-beaf-ba04bab9ef9a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Nov 24 14:38:56 compute-0 podman[218925]: 2025-11-24 14:38:56.464566903 +0000 UTC m=+0.070231005 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.468 187122 WARNING nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.472 187122 DEBUG nova.virt.libvirt.host [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.473 187122 DEBUG nova.virt.libvirt.host [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.475 187122 DEBUG nova.virt.libvirt.host [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.476 187122 DEBUG nova.virt.libvirt.host [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.476 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.477 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-24T14:28:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6e922a91-f8b6-466b-9721-3ed72f453145',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-24T14:28:51Z,direct_url=<?>,disk_format='qcow2',id=54a328f6-92ea-410e-beaf-ba04bab9ef9a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5f2c2c59dcfb47f49d179fade7a63aba',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-24T14:28:52Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.477 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.477 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.478 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.478 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.478 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.478 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.479 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.479 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.479 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.479 187122 DEBUG nova.virt.hardware [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.483 187122 DEBUG nova.virt.libvirt.vif [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:38:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2006939476',display_name='tempest-TestNetworkBasicOps-server-2006939476',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2006939476',id=13,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAPfeSMXBNT0bS11P2pN5ym+CFCkJn5RROf7lJr7FyNG/zmQHuAnxdmdonsK141KqjY3HQ19kOh/CmvbHf+0yESTfYy3p2uG7QVdhkDKrnlelfdL0HfnuaDNfHjfClKXA==',key_name='tempest-TestNetworkBasicOps-806601726',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-mz013yv8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:38:51Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=deec7f7a-de1e-4cb1-b74c-f47abc760797,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.483 187122 DEBUG nova.network.os_vif_util [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.484 187122 DEBUG nova.network.os_vif_util [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:c7:0b,bridge_name='br-int',has_traffic_filtering=True,id=1f30b9ed-543c-4644-a445-5d12cae7ae11,network=Network(8b997ab9-49f8-499b-8f6f-e77ce99c144f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f30b9ed-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.485 187122 DEBUG nova.objects.instance [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'pci_devices' on Instance uuid deec7f7a-de1e-4cb1-b74c-f47abc760797 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.495 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] End _get_guest_xml xml=<domain type="kvm">
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <uuid>deec7f7a-de1e-4cb1-b74c-f47abc760797</uuid>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <name>instance-0000000d</name>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <memory>131072</memory>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <vcpu>1</vcpu>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <metadata>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <nova:name>tempest-TestNetworkBasicOps-server-2006939476</nova:name>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <nova:creationTime>2025-11-24 14:38:56</nova:creationTime>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <nova:flavor name="m1.nano">
Nov 24 14:38:56 compute-0 nova_compute[187118]:         <nova:memory>128</nova:memory>
Nov 24 14:38:56 compute-0 nova_compute[187118]:         <nova:disk>1</nova:disk>
Nov 24 14:38:56 compute-0 nova_compute[187118]:         <nova:swap>0</nova:swap>
Nov 24 14:38:56 compute-0 nova_compute[187118]:         <nova:ephemeral>0</nova:ephemeral>
Nov 24 14:38:56 compute-0 nova_compute[187118]:         <nova:vcpus>1</nova:vcpus>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       </nova:flavor>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <nova:owner>
Nov 24 14:38:56 compute-0 nova_compute[187118]:         <nova:user uuid="ef366911f162401f897bcd979ad0c45a">tempest-TestNetworkBasicOps-449241238-project-member</nova:user>
Nov 24 14:38:56 compute-0 nova_compute[187118]:         <nova:project uuid="0b17c7cc946a4f86aea7e5b323e88562">tempest-TestNetworkBasicOps-449241238</nova:project>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       </nova:owner>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <nova:root type="image" uuid="54a328f6-92ea-410e-beaf-ba04bab9ef9a"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <nova:ports>
Nov 24 14:38:56 compute-0 nova_compute[187118]:         <nova:port uuid="1f30b9ed-543c-4644-a445-5d12cae7ae11">
Nov 24 14:38:56 compute-0 nova_compute[187118]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:         </nova:port>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       </nova:ports>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     </nova:instance>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   </metadata>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <sysinfo type="smbios">
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <system>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <entry name="manufacturer">RDO</entry>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <entry name="product">OpenStack Compute</entry>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <entry name="serial">deec7f7a-de1e-4cb1-b74c-f47abc760797</entry>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <entry name="uuid">deec7f7a-de1e-4cb1-b74c-f47abc760797</entry>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <entry name="family">Virtual Machine</entry>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     </system>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   </sysinfo>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <os>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <type arch="x86_64" machine="q35">hvm</type>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <boot dev="hd"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <smbios mode="sysinfo"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   </os>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <features>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <acpi/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <apic/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <vmcoreinfo/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   </features>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <clock offset="utc">
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <timer name="pit" tickpolicy="delay"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <timer name="rtc" tickpolicy="catchup"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <timer name="hpet" present="no"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   </clock>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <cpu mode="host-model" match="exact">
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <topology sockets="1" cores="1" threads="1"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   </cpu>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   <devices>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <disk type="file" device="disk">
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <driver name="qemu" type="qcow2" cache="none"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <target dev="vda" bus="virtio"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <disk type="file" device="cdrom">
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <driver name="qemu" type="raw" cache="none"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <source file="/var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk.config"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <target dev="sda" bus="sata"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     </disk>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <interface type="ethernet">
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <mac address="fa:16:3e:12:c7:0b"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <driver name="vhost" rx_queue_size="512"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <mtu size="1442"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <target dev="tap1f30b9ed-54"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     </interface>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <serial type="pty">
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <log file="/var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/console.log" append="off"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     </serial>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <video>
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <model type="virtio"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     </video>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <input type="tablet" bus="usb"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <rng model="virtio">
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <backend model="random">/dev/urandom</backend>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     </rng>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="pci" model="pcie-root-port"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <controller type="usb" index="0"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     <memballoon model="virtio">
Nov 24 14:38:56 compute-0 nova_compute[187118]:       <stats period="10"/>
Nov 24 14:38:56 compute-0 nova_compute[187118]:     </memballoon>
Nov 24 14:38:56 compute-0 nova_compute[187118]:   </devices>
Nov 24 14:38:56 compute-0 nova_compute[187118]: </domain>
Nov 24 14:38:56 compute-0 nova_compute[187118]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.496 187122 DEBUG nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Preparing to wait for external event network-vif-plugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.497 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.497 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.497 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.498 187122 DEBUG nova.virt.libvirt.vif [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-24T14:38:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2006939476',display_name='tempest-TestNetworkBasicOps-server-2006939476',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2006939476',id=13,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAPfeSMXBNT0bS11P2pN5ym+CFCkJn5RROf7lJr7FyNG/zmQHuAnxdmdonsK141KqjY3HQ19kOh/CmvbHf+0yESTfYy3p2uG7QVdhkDKrnlelfdL0HfnuaDNfHjfClKXA==',key_name='tempest-TestNetworkBasicOps-806601726',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-mz013yv8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-24T14:38:51Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=deec7f7a-de1e-4cb1-b74c-f47abc760797,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.498 187122 DEBUG nova.network.os_vif_util [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.499 187122 DEBUG nova.network.os_vif_util [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:12:c7:0b,bridge_name='br-int',has_traffic_filtering=True,id=1f30b9ed-543c-4644-a445-5d12cae7ae11,network=Network(8b997ab9-49f8-499b-8f6f-e77ce99c144f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f30b9ed-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.499 187122 DEBUG os_vif [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:c7:0b,bridge_name='br-int',has_traffic_filtering=True,id=1f30b9ed-543c-4644-a445-5d12cae7ae11,network=Network(8b997ab9-49f8-499b-8f6f-e77ce99c144f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f30b9ed-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.500 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.500 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.501 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.503 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.503 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f30b9ed-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.504 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f30b9ed-54, col_values=(('external_ids', {'iface-id': '1f30b9ed-543c-4644-a445-5d12cae7ae11', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:12:c7:0b', 'vm-uuid': 'deec7f7a-de1e-4cb1-b74c-f47abc760797'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.506 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:56 compute-0 NetworkManager[55697]: <info>  [1763995136.5070] manager: (tap1f30b9ed-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.509 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.512 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.513 187122 INFO os_vif [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:12:c7:0b,bridge_name='br-int',has_traffic_filtering=True,id=1f30b9ed-543c-4644-a445-5d12cae7ae11,network=Network(8b997ab9-49f8-499b-8f6f-e77ce99c144f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f30b9ed-54')
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.554 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.554 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.554 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] No VIF found with MAC fa:16:3e:12:c7:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.555 187122 INFO nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Using config drive
Nov 24 14:38:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:56.665 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:56.665 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:56.665 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.880 187122 INFO nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Creating config drive at /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk.config
Nov 24 14:38:56 compute-0 nova_compute[187118]: 2025-11-24 14:38:56.889 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuubzchqs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.031 187122 DEBUG oslo_concurrency.processutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuubzchqs" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:38:57 compute-0 kernel: tap1f30b9ed-54: entered promiscuous mode
Nov 24 14:38:57 compute-0 NetworkManager[55697]: <info>  [1763995137.0919] manager: (tap1f30b9ed-54): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.093 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:57 compute-0 ovn_controller[95613]: 2025-11-24T14:38:57Z|00154|binding|INFO|Claiming lport 1f30b9ed-543c-4644-a445-5d12cae7ae11 for this chassis.
Nov 24 14:38:57 compute-0 ovn_controller[95613]: 2025-11-24T14:38:57Z|00155|binding|INFO|1f30b9ed-543c-4644-a445-5d12cae7ae11: Claiming fa:16:3e:12:c7:0b 10.100.0.7
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.104 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.114 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:c7:0b 10.100.0.7'], port_security=['fa:16:3e:12:c7:0b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'deec7f7a-de1e-4cb1-b74c-f47abc760797', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b997ab9-49f8-499b-8f6f-e77ce99c144f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '2', 'neutron:security_group_ids': '221f1bcc-2670-4c6d-8839-df20c44d3b24', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fdd0f0e-04a1-4ef2-9a6e-87187e079b44, chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=1f30b9ed-543c-4644-a445-5d12cae7ae11) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.115 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 1f30b9ed-543c-4644-a445-5d12cae7ae11 in datapath 8b997ab9-49f8-499b-8f6f-e77ce99c144f bound to our chassis
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.116 104469 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8b997ab9-49f8-499b-8f6f-e77ce99c144f
Nov 24 14:38:57 compute-0 systemd-udevd[218980]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:38:57 compute-0 systemd-machined[153483]: New machine qemu-13-instance-0000000d.
Nov 24 14:38:57 compute-0 NetworkManager[55697]: <info>  [1763995137.1337] device (tap1f30b9ed-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 24 14:38:57 compute-0 NetworkManager[55697]: <info>  [1763995137.1344] device (tap1f30b9ed-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.135 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[26219aa7-bdef-4258-bbf7-b1ac7039254b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.136 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8b997ab9-41 in ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.138 213394 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8b997ab9-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.138 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[484337e0-5f0d-4e36-bd6c-7e67d52d8d0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.139 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[59505eae-14a4-453a-958b-371cadeda9e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.154 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[3c3a3368-b319-4dbd-baea-7337a4f610e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_controller[95613]: 2025-11-24T14:38:57Z|00156|binding|INFO|Setting lport 1f30b9ed-543c-4644-a445-5d12cae7ae11 ovn-installed in OVS
Nov 24 14:38:57 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Nov 24 14:38:57 compute-0 ovn_controller[95613]: 2025-11-24T14:38:57Z|00157|binding|INFO|Setting lport 1f30b9ed-543c-4644-a445-5d12cae7ae11 up in Southbound
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.160 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.183 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[4cbf2188-272a-456c-b84f-7b4a9cbed0fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.209 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[6dddacaa-ec62-4174-bea6-fdd847a31c2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 systemd-udevd[218984]: Network interface NamePolicy= disabled on kernel command line.
Nov 24 14:38:57 compute-0 NetworkManager[55697]: <info>  [1763995137.2151] manager: (tap8b997ab9-40): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.214 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[ef952c16-c400-4f27-b06e-183c053cf969]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.241 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[7433f1f0-a86e-4215-80bf-c331671f1125]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.245 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[041ec81a-8271-4261-8e73-9eb9f7981f2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 NetworkManager[55697]: <info>  [1763995137.2622] device (tap8b997ab9-40): carrier: link connected
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.266 213415 DEBUG oslo.privsep.daemon [-] privsep: reply[69fb811b-872f-44e0-829d-58270b71c83c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.280 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[139d4c97-3035-4675-bc83-c8de1cdebed2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b997ab9-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:d7:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340469, 'reachable_time': 28424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219014, 'error': None, 'target': 'ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.292 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a44de0c0-2e85-45f0-ad05-895a248f92f5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec9:d702'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340469, 'tstamp': 340469}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219015, 'error': None, 'target': 'ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.312 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[df9138e2-a9a7-49d5-93f9-01909bde4ccd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b997ab9-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c9:d7:02'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340469, 'reachable_time': 28424, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219016, 'error': None, 'target': 'ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.345 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[da9a94ff-75c4-4731-94d3-67c64149b758]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.401 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e6aa08ba-f64f-467a-b8ed-2265bf8f01de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.402 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b997ab9-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.402 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.403 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b997ab9-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.404 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:57 compute-0 kernel: tap8b997ab9-40: entered promiscuous mode
Nov 24 14:38:57 compute-0 NetworkManager[55697]: <info>  [1763995137.4051] manager: (tap8b997ab9-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.411 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8b997ab9-40, col_values=(('external_ids', {'iface-id': 'ce1a6e8f-888a-4b1b-8b90-c49290e66fa2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.412 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:57 compute-0 ovn_controller[95613]: 2025-11-24T14:38:57Z|00158|binding|INFO|Releasing lport ce1a6e8f-888a-4b1b-8b90-c49290e66fa2 from this chassis (sb_readonly=0)
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.423 104469 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b997ab9-49f8-499b-8f6f-e77ce99c144f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b997ab9-49f8-499b-8f6f-e77ce99c144f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.424 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.424 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[181e2372-29d4-41f4-88cc-80e765f71b0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.424 104469 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: global
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     log         /dev/log local0 debug
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     log-tag     haproxy-metadata-proxy-8b997ab9-49f8-499b-8f6f-e77ce99c144f
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     user        root
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     group       root
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     maxconn     1024
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     pidfile     /var/lib/neutron/external/pids/8b997ab9-49f8-499b-8f6f-e77ce99c144f.pid.haproxy
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     daemon
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: defaults
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     log global
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     mode http
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     option httplog
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     option dontlognull
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     option http-server-close
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     option forwardfor
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     retries                 3
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     timeout http-request    30s
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     timeout connect         30s
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     timeout client          32s
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     timeout server          32s
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     timeout http-keep-alive 30s
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: listen listener
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     bind 169.254.169.254:80
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     server metadata /var/lib/neutron/metadata_proxy
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:     http-request add-header X-OVN-Network-ID 8b997ab9-49f8-499b-8f6f-e77ce99c144f
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Nov 24 14:38:57 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:38:57.425 104469 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f', 'env', 'PROCESS_TAG=haproxy-8b997ab9-49f8-499b-8f6f-e77ce99c144f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8b997ab9-49f8-499b-8f6f-e77ce99c144f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.558 187122 DEBUG nova.compute.manager [req-a79e5345-1bfc-4425-ac30-3c2dac752af8 req-89c4d1a7-9ce3-490b-a6b4-4cc80b8a1e79 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received event network-vif-plugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.559 187122 DEBUG oslo_concurrency.lockutils [req-a79e5345-1bfc-4425-ac30-3c2dac752af8 req-89c4d1a7-9ce3-490b-a6b4-4cc80b8a1e79 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.560 187122 DEBUG oslo_concurrency.lockutils [req-a79e5345-1bfc-4425-ac30-3c2dac752af8 req-89c4d1a7-9ce3-490b-a6b4-4cc80b8a1e79 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.560 187122 DEBUG oslo_concurrency.lockutils [req-a79e5345-1bfc-4425-ac30-3c2dac752af8 req-89c4d1a7-9ce3-490b-a6b4-4cc80b8a1e79 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.561 187122 DEBUG nova.compute.manager [req-a79e5345-1bfc-4425-ac30-3c2dac752af8 req-89c4d1a7-9ce3-490b-a6b4-4cc80b8a1e79 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Processing event network-vif-plugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.717 187122 DEBUG nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.719 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995137.7168872, deec7f7a-de1e-4cb1-b74c-f47abc760797 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.719 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] VM Started (Lifecycle Event)
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.723 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.726 187122 INFO nova.virt.libvirt.driver [-] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Instance spawned successfully.
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.727 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.739 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.746 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.749 187122 DEBUG nova.network.neutron [req-3c861e15-155c-4c34-aeca-5fc79070eea0 req-7671419d-e66c-4ee7-bad7-ff82dbd90ad8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Updated VIF entry in instance network info cache for port 1f30b9ed-543c-4644-a445-5d12cae7ae11. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.749 187122 DEBUG nova.network.neutron [req-3c861e15-155c-4c34-aeca-5fc79070eea0 req-7671419d-e66c-4ee7-bad7-ff82dbd90ad8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Updating instance_info_cache with network_info: [{"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.751 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.751 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.752 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.752 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.753 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.753 187122 DEBUG nova.virt.libvirt.driver [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.776 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.777 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995137.7172148, deec7f7a-de1e-4cb1-b74c-f47abc760797 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.777 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] VM Paused (Lifecycle Event)
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.778 187122 DEBUG oslo_concurrency.lockutils [req-3c861e15-155c-4c34-aeca-5fc79070eea0 req-7671419d-e66c-4ee7-bad7-ff82dbd90ad8 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.802 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.805 187122 DEBUG nova.virt.driver [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] Emitting event <LifecycleEvent: 1763995137.7212086, deec7f7a-de1e-4cb1-b74c-f47abc760797 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.806 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] VM Resumed (Lifecycle Event)
Nov 24 14:38:57 compute-0 podman[219055]: 2025-11-24 14:38:57.809151202 +0000 UTC m=+0.063215004 container create d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.812 187122 INFO nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Took 6.04 seconds to spawn the instance on the hypervisor.
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.812 187122 DEBUG nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.818 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.826 187122 DEBUG nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 24 14:38:57 compute-0 systemd[1]: Started libpod-conmon-d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b.scope.
Nov 24 14:38:57 compute-0 podman[219055]: 2025-11-24 14:38:57.767523041 +0000 UTC m=+0.021586833 image pull 1579eb8af8e4bc6d332a87a6e64650b1ebece1e7fc815782917ed57a649216c9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.868 187122 INFO nova.compute.manager [None req-22472df3-2424-420a-82eb-603bfc889503 - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 24 14:38:57 compute-0 systemd[1]: Started libcrun container.
Nov 24 14:38:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87241737fbe35adba498c23615e187a8d0d75d90cf39c3186dad47b5e38f3c32/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.909 187122 INFO nova.compute.manager [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Took 6.65 seconds to build instance.
Nov 24 14:38:57 compute-0 podman[219055]: 2025-11-24 14:38:57.91895456 +0000 UTC m=+0.173018352 container init d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:38:57 compute-0 nova_compute[187118]: 2025-11-24 14:38:57.926 187122 DEBUG oslo_concurrency.lockutils [None req-83b9d9b2-1270-463a-9fab-77ce20e24afa ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:57 compute-0 podman[219055]: 2025-11-24 14:38:57.925242552 +0000 UTC m=+0.179306344 container start d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118)
Nov 24 14:38:57 compute-0 neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f[219070]: [NOTICE]   (219074) : New worker (219076) forked
Nov 24 14:38:57 compute-0 neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f[219070]: [NOTICE]   (219074) : Loading success.
Nov 24 14:38:59 compute-0 nova_compute[187118]: 2025-11-24 14:38:59.614 187122 DEBUG nova.compute.manager [req-53f2afc7-2b5a-4966-94b9-b41ec65a7040 req-e42e87c0-ad97-46c5-b5a9-02b9dd1182e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received event network-vif-plugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:38:59 compute-0 nova_compute[187118]: 2025-11-24 14:38:59.614 187122 DEBUG oslo_concurrency.lockutils [req-53f2afc7-2b5a-4966-94b9-b41ec65a7040 req-e42e87c0-ad97-46c5-b5a9-02b9dd1182e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:38:59 compute-0 nova_compute[187118]: 2025-11-24 14:38:59.614 187122 DEBUG oslo_concurrency.lockutils [req-53f2afc7-2b5a-4966-94b9-b41ec65a7040 req-e42e87c0-ad97-46c5-b5a9-02b9dd1182e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:38:59 compute-0 nova_compute[187118]: 2025-11-24 14:38:59.614 187122 DEBUG oslo_concurrency.lockutils [req-53f2afc7-2b5a-4966-94b9-b41ec65a7040 req-e42e87c0-ad97-46c5-b5a9-02b9dd1182e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:38:59 compute-0 nova_compute[187118]: 2025-11-24 14:38:59.615 187122 DEBUG nova.compute.manager [req-53f2afc7-2b5a-4966-94b9-b41ec65a7040 req-e42e87c0-ad97-46c5-b5a9-02b9dd1182e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] No waiting events found dispatching network-vif-plugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:38:59 compute-0 nova_compute[187118]: 2025-11-24 14:38:59.615 187122 WARNING nova.compute.manager [req-53f2afc7-2b5a-4966-94b9-b41ec65a7040 req-e42e87c0-ad97-46c5-b5a9-02b9dd1182e5 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received unexpected event network-vif-plugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 for instance with vm_state active and task_state None.
Nov 24 14:38:59 compute-0 ovn_controller[95613]: 2025-11-24T14:38:59Z|00159|binding|INFO|Releasing lport ce1a6e8f-888a-4b1b-8b90-c49290e66fa2 from this chassis (sb_readonly=0)
Nov 24 14:38:59 compute-0 nova_compute[187118]: 2025-11-24 14:38:59.977 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:38:59 compute-0 NetworkManager[55697]: <info>  [1763995139.9785] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Nov 24 14:38:59 compute-0 NetworkManager[55697]: <info>  [1763995139.9804] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Nov 24 14:39:00 compute-0 nova_compute[187118]: 2025-11-24 14:39:00.013 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:00 compute-0 ovn_controller[95613]: 2025-11-24T14:39:00Z|00160|binding|INFO|Releasing lport ce1a6e8f-888a-4b1b-8b90-c49290e66fa2 from this chassis (sb_readonly=0)
Nov 24 14:39:00 compute-0 nova_compute[187118]: 2025-11-24 14:39:00.017 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:00 compute-0 nova_compute[187118]: 2025-11-24 14:39:00.025 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:00 compute-0 podman[219087]: 2025-11-24 14:39:00.455925187 +0000 UTC m=+0.060081627 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, release=1755695350)
Nov 24 14:39:00 compute-0 podman[219086]: 2025-11-24 14:39:00.473401396 +0000 UTC m=+0.082440139 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 24 14:39:00 compute-0 nova_compute[187118]: 2025-11-24 14:39:00.616 187122 DEBUG nova.compute.manager [req-5cecefa6-2226-4d0e-8f8c-8416dfb18e1b req-9f396cff-8337-4239-94f4-73e1a65d788b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received event network-changed-1f30b9ed-543c-4644-a445-5d12cae7ae11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:39:00 compute-0 nova_compute[187118]: 2025-11-24 14:39:00.616 187122 DEBUG nova.compute.manager [req-5cecefa6-2226-4d0e-8f8c-8416dfb18e1b req-9f396cff-8337-4239-94f4-73e1a65d788b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Refreshing instance network info cache due to event network-changed-1f30b9ed-543c-4644-a445-5d12cae7ae11. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:39:00 compute-0 nova_compute[187118]: 2025-11-24 14:39:00.617 187122 DEBUG oslo_concurrency.lockutils [req-5cecefa6-2226-4d0e-8f8c-8416dfb18e1b req-9f396cff-8337-4239-94f4-73e1a65d788b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:39:00 compute-0 nova_compute[187118]: 2025-11-24 14:39:00.617 187122 DEBUG oslo_concurrency.lockutils [req-5cecefa6-2226-4d0e-8f8c-8416dfb18e1b req-9f396cff-8337-4239-94f4-73e1a65d788b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:39:00 compute-0 nova_compute[187118]: 2025-11-24 14:39:00.617 187122 DEBUG nova.network.neutron [req-5cecefa6-2226-4d0e-8f8c-8416dfb18e1b req-9f396cff-8337-4239-94f4-73e1a65d788b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Refreshing network info cache for port 1f30b9ed-543c-4644-a445-5d12cae7ae11 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:39:01 compute-0 nova_compute[187118]: 2025-11-24 14:39:01.506 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:01 compute-0 nova_compute[187118]: 2025-11-24 14:39:01.700 187122 DEBUG nova.network.neutron [req-5cecefa6-2226-4d0e-8f8c-8416dfb18e1b req-9f396cff-8337-4239-94f4-73e1a65d788b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Updated VIF entry in instance network info cache for port 1f30b9ed-543c-4644-a445-5d12cae7ae11. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:39:01 compute-0 nova_compute[187118]: 2025-11-24 14:39:01.704 187122 DEBUG nova.network.neutron [req-5cecefa6-2226-4d0e-8f8c-8416dfb18e1b req-9f396cff-8337-4239-94f4-73e1a65d788b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Updating instance_info_cache with network_info: [{"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:39:01 compute-0 nova_compute[187118]: 2025-11-24 14:39:01.723 187122 DEBUG oslo_concurrency.lockutils [req-5cecefa6-2226-4d0e-8f8c-8416dfb18e1b req-9f396cff-8337-4239-94f4-73e1a65d788b 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:39:05 compute-0 nova_compute[187118]: 2025-11-24 14:39:05.020 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:06 compute-0 nova_compute[187118]: 2025-11-24 14:39:06.511 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:08 compute-0 podman[219133]: 2025-11-24 14:39:08.450553302 +0000 UTC m=+0.055974305 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:39:09 compute-0 ovn_controller[95613]: 2025-11-24T14:39:09Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:12:c7:0b 10.100.0.7
Nov 24 14:39:09 compute-0 ovn_controller[95613]: 2025-11-24T14:39:09Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:12:c7:0b 10.100.0.7
Nov 24 14:39:10 compute-0 nova_compute[187118]: 2025-11-24 14:39:10.023 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:11 compute-0 nova_compute[187118]: 2025-11-24 14:39:11.514 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:15 compute-0 nova_compute[187118]: 2025-11-24 14:39:15.026 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:15 compute-0 nova_compute[187118]: 2025-11-24 14:39:15.139 187122 INFO nova.compute.manager [None req-ecb018be-7b29-4ee4-947b-6ae68a557fa0 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Get console output
Nov 24 14:39:15 compute-0 nova_compute[187118]: 2025-11-24 14:39:15.148 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:39:15 compute-0 podman[219170]: 2025-11-24 14:39:15.477314107 +0000 UTC m=+0.073699870 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:39:16 compute-0 nova_compute[187118]: 2025-11-24 14:39:16.517 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:17 compute-0 ovn_controller[95613]: 2025-11-24T14:39:17Z|00161|binding|INFO|Releasing lport ce1a6e8f-888a-4b1b-8b90-c49290e66fa2 from this chassis (sb_readonly=0)
Nov 24 14:39:17 compute-0 nova_compute[187118]: 2025-11-24 14:39:17.483 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:17 compute-0 ovn_controller[95613]: 2025-11-24T14:39:17Z|00162|binding|INFO|Releasing lport ce1a6e8f-888a-4b1b-8b90-c49290e66fa2 from this chassis (sb_readonly=0)
Nov 24 14:39:17 compute-0 nova_compute[187118]: 2025-11-24 14:39:17.556 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:19 compute-0 nova_compute[187118]: 2025-11-24 14:39:19.182 187122 INFO nova.compute.manager [None req-7b957be7-4316-410c-9c04-9b1cd710cdbe ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Get console output
Nov 24 14:39:19 compute-0 nova_compute[187118]: 2025-11-24 14:39:19.189 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:39:19 compute-0 nova_compute[187118]: 2025-11-24 14:39:19.690 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:19 compute-0 NetworkManager[55697]: <info>  [1763995159.6922] manager: (patch-br-int-to-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Nov 24 14:39:19 compute-0 NetworkManager[55697]: <info>  [1763995159.6934] manager: (patch-provnet-4fe4baa8-3d37-4e4d-b444-d465ded6f335-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 24 14:39:19 compute-0 ovn_controller[95613]: 2025-11-24T14:39:19Z|00163|binding|INFO|Releasing lport ce1a6e8f-888a-4b1b-8b90-c49290e66fa2 from this chassis (sb_readonly=0)
Nov 24 14:39:19 compute-0 nova_compute[187118]: 2025-11-24 14:39:19.750 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:19 compute-0 nova_compute[187118]: 2025-11-24 14:39:19.761 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.017 187122 INFO nova.compute.manager [None req-e02a84c0-590e-4d7f-9577-7abf36ce4c3b ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Get console output
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.025 213288 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.028 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.740 187122 DEBUG nova.compute.manager [req-59316482-4dfa-49a2-bc8f-3ec385ccfb98 req-fd1b0315-66fb-4579-a850-3e535d0a5a84 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received event network-changed-1f30b9ed-543c-4644-a445-5d12cae7ae11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.740 187122 DEBUG nova.compute.manager [req-59316482-4dfa-49a2-bc8f-3ec385ccfb98 req-fd1b0315-66fb-4579-a850-3e535d0a5a84 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Refreshing instance network info cache due to event network-changed-1f30b9ed-543c-4644-a445-5d12cae7ae11. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.741 187122 DEBUG oslo_concurrency.lockutils [req-59316482-4dfa-49a2-bc8f-3ec385ccfb98 req-fd1b0315-66fb-4579-a850-3e535d0a5a84 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.741 187122 DEBUG oslo_concurrency.lockutils [req-59316482-4dfa-49a2-bc8f-3ec385ccfb98 req-fd1b0315-66fb-4579-a850-3e535d0a5a84 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquired lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.741 187122 DEBUG nova.network.neutron [req-59316482-4dfa-49a2-bc8f-3ec385ccfb98 req-fd1b0315-66fb-4579-a850-3e535d0a5a84 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Refreshing network info cache for port 1f30b9ed-543c-4644-a445-5d12cae7ae11 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.794 187122 DEBUG oslo_concurrency.lockutils [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "deec7f7a-de1e-4cb1-b74c-f47abc760797" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.794 187122 DEBUG oslo_concurrency.lockutils [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.794 187122 DEBUG oslo_concurrency.lockutils [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.795 187122 DEBUG oslo_concurrency.lockutils [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.795 187122 DEBUG oslo_concurrency.lockutils [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.796 187122 INFO nova.compute.manager [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Terminating instance
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.796 187122 DEBUG nova.compute.manager [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Nov 24 14:39:20 compute-0 kernel: tap1f30b9ed-54 (unregistering): left promiscuous mode
Nov 24 14:39:20 compute-0 NetworkManager[55697]: <info>  [1763995160.8208] device (tap1f30b9ed-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.834 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:20 compute-0 ovn_controller[95613]: 2025-11-24T14:39:20Z|00164|binding|INFO|Releasing lport 1f30b9ed-543c-4644-a445-5d12cae7ae11 from this chassis (sb_readonly=0)
Nov 24 14:39:20 compute-0 ovn_controller[95613]: 2025-11-24T14:39:20Z|00165|binding|INFO|Setting lport 1f30b9ed-543c-4644-a445-5d12cae7ae11 down in Southbound
Nov 24 14:39:20 compute-0 ovn_controller[95613]: 2025-11-24T14:39:20Z|00166|binding|INFO|Removing iface tap1f30b9ed-54 ovn-installed in OVS
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.837 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:20 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:20.842 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:12:c7:0b 10.100.0.7'], port_security=['fa:16:3e:12:c7:0b 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'deec7f7a-de1e-4cb1-b74c-f47abc760797', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b997ab9-49f8-499b-8f6f-e77ce99c144f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0b17c7cc946a4f86aea7e5b323e88562', 'neutron:revision_number': '4', 'neutron:security_group_ids': '221f1bcc-2670-4c6d-8839-df20c44d3b24', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6fdd0f0e-04a1-4ef2-9a6e-87187e079b44, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>], logical_port=1f30b9ed-543c-4644-a445-5d12cae7ae11) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f3e56694e20>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:39:20 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:20.843 104469 INFO neutron.agent.ovn.metadata.agent [-] Port 1f30b9ed-543c-4644-a445-5d12cae7ae11 in datapath 8b997ab9-49f8-499b-8f6f-e77ce99c144f unbound from our chassis
Nov 24 14:39:20 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:20.844 104469 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b997ab9-49f8-499b-8f6f-e77ce99c144f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 24 14:39:20 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:20.845 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e9e86c61-e03a-4483-900a-c71a0f3bfacc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:39:20 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:20.846 104469 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f namespace which is not needed anymore
Nov 24 14:39:20 compute-0 nova_compute[187118]: 2025-11-24 14:39:20.857 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:20 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 24 14:39:20 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 12.902s CPU time.
Nov 24 14:39:20 compute-0 systemd-machined[153483]: Machine qemu-13-instance-0000000d terminated.
Nov 24 14:39:21 compute-0 neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f[219070]: [NOTICE]   (219074) : haproxy version is 2.8.14-c23fe91
Nov 24 14:39:21 compute-0 neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f[219070]: [NOTICE]   (219074) : path to executable is /usr/sbin/haproxy
Nov 24 14:39:21 compute-0 neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f[219070]: [WARNING]  (219074) : Exiting Master process...
Nov 24 14:39:21 compute-0 neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f[219070]: [ALERT]    (219074) : Current worker (219076) exited with code 143 (Terminated)
Nov 24 14:39:21 compute-0 neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f[219070]: [WARNING]  (219074) : All workers exited. Exiting... (0)
Nov 24 14:39:21 compute-0 systemd[1]: libpod-d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b.scope: Deactivated successfully.
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.024 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:21 compute-0 podman[219221]: 2025-11-24 14:39:21.026633815 +0000 UTC m=+0.060203491 container died d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.032 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.046 187122 DEBUG nova.compute.manager [req-22165cbf-eef2-44c4-96a3-e586ab0e0730 req-26f50d24-cca8-411e-b606-493806dfa98a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received event network-vif-unplugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.047 187122 DEBUG oslo_concurrency.lockutils [req-22165cbf-eef2-44c4-96a3-e586ab0e0730 req-26f50d24-cca8-411e-b606-493806dfa98a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.048 187122 DEBUG oslo_concurrency.lockutils [req-22165cbf-eef2-44c4-96a3-e586ab0e0730 req-26f50d24-cca8-411e-b606-493806dfa98a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.048 187122 DEBUG oslo_concurrency.lockutils [req-22165cbf-eef2-44c4-96a3-e586ab0e0730 req-26f50d24-cca8-411e-b606-493806dfa98a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.049 187122 DEBUG nova.compute.manager [req-22165cbf-eef2-44c4-96a3-e586ab0e0730 req-26f50d24-cca8-411e-b606-493806dfa98a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] No waiting events found dispatching network-vif-unplugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.049 187122 DEBUG nova.compute.manager [req-22165cbf-eef2-44c4-96a3-e586ab0e0730 req-26f50d24-cca8-411e-b606-493806dfa98a 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received event network-vif-unplugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Nov 24 14:39:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-87241737fbe35adba498c23615e187a8d0d75d90cf39c3186dad47b5e38f3c32-merged.mount: Deactivated successfully.
Nov 24 14:39:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b-userdata-shm.mount: Deactivated successfully.
Nov 24 14:39:21 compute-0 podman[219221]: 2025-11-24 14:39:21.078418853 +0000 UTC m=+0.111988529 container cleanup d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.085 187122 INFO nova.virt.libvirt.driver [-] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Instance destroyed successfully.
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.086 187122 DEBUG nova.objects.instance [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lazy-loading 'resources' on Instance uuid deec7f7a-de1e-4cb1-b74c-f47abc760797 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 24 14:39:21 compute-0 systemd[1]: libpod-conmon-d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b.scope: Deactivated successfully.
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.097 187122 DEBUG nova.virt.libvirt.vif [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-24T14:38:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2006939476',display_name='tempest-TestNetworkBasicOps-server-2006939476',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2006939476',id=13,image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLAPfeSMXBNT0bS11P2pN5ym+CFCkJn5RROf7lJr7FyNG/zmQHuAnxdmdonsK141KqjY3HQ19kOh/CmvbHf+0yESTfYy3p2uG7QVdhkDKrnlelfdL0HfnuaDNfHjfClKXA==',key_name='tempest-TestNetworkBasicOps-806601726',keypairs=<?>,launch_index=0,launched_at=2025-11-24T14:38:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0b17c7cc946a4f86aea7e5b323e88562',ramdisk_id='',reservation_id='r-mz013yv8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='54a328f6-92ea-410e-beaf-ba04bab9ef9a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-449241238',owner_user_name='tempest-TestNetworkBasicOps-449241238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-24T14:38:57Z,user_data=None,user_id='ef366911f162401f897bcd979ad0c45a',uuid=deec7f7a-de1e-4cb1-b74c-f47abc760797,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.098 187122 DEBUG nova.network.os_vif_util [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converting VIF {"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.099 187122 DEBUG nova.network.os_vif_util [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:12:c7:0b,bridge_name='br-int',has_traffic_filtering=True,id=1f30b9ed-543c-4644-a445-5d12cae7ae11,network=Network(8b997ab9-49f8-499b-8f6f-e77ce99c144f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f30b9ed-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.099 187122 DEBUG os_vif [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:12:c7:0b,bridge_name='br-int',has_traffic_filtering=True,id=1f30b9ed-543c-4644-a445-5d12cae7ae11,network=Network(8b997ab9-49f8-499b-8f6f-e77ce99c144f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f30b9ed-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.102 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.102 187122 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f30b9ed-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.103 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.104 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.109 187122 INFO os_vif [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:12:c7:0b,bridge_name='br-int',has_traffic_filtering=True,id=1f30b9ed-543c-4644-a445-5d12cae7ae11,network=Network(8b997ab9-49f8-499b-8f6f-e77ce99c144f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f30b9ed-54')
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.110 187122 INFO nova.virt.libvirt.driver [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Deleting instance files /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797_del
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.111 187122 INFO nova.virt.libvirt.driver [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Deletion of /var/lib/nova/instances/deec7f7a-de1e-4cb1-b74c-f47abc760797_del complete
Nov 24 14:39:21 compute-0 podman[219266]: 2025-11-24 14:39:21.15857759 +0000 UTC m=+0.056564011 container remove d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 14:39:21 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:21.164 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[a99f5db8-f057-4864-a856-844a37762202]: (4, ('Mon Nov 24 02:39:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f (d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b)\nd9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b\nMon Nov 24 02:39:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f (d9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b)\nd9284d6f8cea989dd070dc428be64f253b31558227462ffd3e36a15f1436753b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:39:21 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:21.166 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[8ed03ced-42b5-4d6f-82f6-4b981f031fda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:39:21 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:21.167 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b997ab9-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:39:21 compute-0 kernel: tap8b997ab9-40: left promiscuous mode
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.169 187122 INFO nova.compute.manager [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Took 0.37 seconds to destroy the instance on the hypervisor.
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.171 187122 DEBUG oslo.service.loopingcall [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.171 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:21 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:21.173 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[b27c0a1c-57ee-4edf-ae04-b83bdf5d9ca8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.173 187122 DEBUG nova.compute.manager [-] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.174 187122 DEBUG nova.network.neutron [-] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Nov 24 14:39:21 compute-0 nova_compute[187118]: 2025-11-24 14:39:21.185 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:21 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:21.207 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[e7146b22-b572-47b9-8e61-59baadd4aa6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:39:21 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:21.209 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[84211626-c596-4faf-81c2-6186f0e3a674]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:39:21 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:21.226 213394 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9695a6-a991-46a3-ab2a-962edd37bd50]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340463, 'reachable_time': 23182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219281, 'error': None, 'target': 'ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:39:21 compute-0 systemd[1]: run-netns-ovnmeta\x2d8b997ab9\x2d49f8\x2d499b\x2d8f6f\x2de77ce99c144f.mount: Deactivated successfully.
Nov 24 14:39:21 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:21.229 104584 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8b997ab9-49f8-499b-8f6f-e77ce99c144f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Nov 24 14:39:21 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:21.229 104584 DEBUG oslo.privsep.daemon [-] privsep: reply[84b7d63a-3a96-40a7-92f8-6b4a64e1b582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.016 187122 DEBUG nova.network.neutron [-] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.048 187122 INFO nova.compute.manager [-] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Took 0.87 seconds to deallocate network for instance.
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.090 187122 DEBUG oslo_concurrency.lockutils [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.091 187122 DEBUG oslo_concurrency.lockutils [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:39:22 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.157 187122 DEBUG nova.compute.provider_tree [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.163 187122 DEBUG nova.network.neutron [req-59316482-4dfa-49a2-bc8f-3ec385ccfb98 req-fd1b0315-66fb-4579-a850-3e535d0a5a84 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Updated VIF entry in instance network info cache for port 1f30b9ed-543c-4644-a445-5d12cae7ae11. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.164 187122 DEBUG nova.network.neutron [req-59316482-4dfa-49a2-bc8f-3ec385ccfb98 req-fd1b0315-66fb-4579-a850-3e535d0a5a84 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Updating instance_info_cache with network_info: [{"id": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "address": "fa:16:3e:12:c7:0b", "network": {"id": "8b997ab9-49f8-499b-8f6f-e77ce99c144f", "bridge": "br-int", "label": "tempest-network-smoke--1118462338", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0b17c7cc946a4f86aea7e5b323e88562", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f30b9ed-54", "ovs_interfaceid": "1f30b9ed-543c-4644-a445-5d12cae7ae11", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.180 187122 DEBUG nova.scheduler.client.report [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.185 187122 DEBUG oslo_concurrency.lockutils [req-59316482-4dfa-49a2-bc8f-3ec385ccfb98 req-fd1b0315-66fb-4579-a850-3e535d0a5a84 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Releasing lock "refresh_cache-deec7f7a-de1e-4cb1-b74c-f47abc760797" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.198 187122 DEBUG oslo_concurrency.lockutils [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.228 187122 INFO nova.scheduler.client.report [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Deleted allocations for instance deec7f7a-de1e-4cb1-b74c-f47abc760797
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.275 187122 DEBUG oslo_concurrency.lockutils [None req-32844062-5a8f-4132-9ba9-5f2ef705f4a2 ef366911f162401f897bcd979ad0c45a 0b17c7cc946a4f86aea7e5b323e88562 - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.832 187122 DEBUG nova.compute.manager [req-55550c4d-601f-498e-9793-77cf0935842d req-93b43ed7-e219-4140-b9f2-a5598c16d059 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received event network-vif-deleted-1f30b9ed-543c-4644-a445-5d12cae7ae11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.833 187122 INFO nova.compute.manager [req-55550c4d-601f-498e-9793-77cf0935842d req-93b43ed7-e219-4140-b9f2-a5598c16d059 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Neutron deleted interface 1f30b9ed-543c-4644-a445-5d12cae7ae11; detaching it from the instance and deleting it from the info cache
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.833 187122 DEBUG nova.network.neutron [req-55550c4d-601f-498e-9793-77cf0935842d req-93b43ed7-e219-4140-b9f2-a5598c16d059 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Nov 24 14:39:22 compute-0 nova_compute[187118]: 2025-11-24 14:39:22.837 187122 DEBUG nova.compute.manager [req-55550c4d-601f-498e-9793-77cf0935842d req-93b43ed7-e219-4140-b9f2-a5598c16d059 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Detach interface failed, port_id=1f30b9ed-543c-4644-a445-5d12cae7ae11, reason: Instance deec7f7a-de1e-4cb1-b74c-f47abc760797 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Nov 24 14:39:23 compute-0 nova_compute[187118]: 2025-11-24 14:39:23.146 187122 DEBUG nova.compute.manager [req-6bebdb3f-6a0a-4d9e-bc2e-a4baf288b943 req-7a64564e-b1ba-47f4-8f5d-8a91a8a96fab 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received event network-vif-plugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 24 14:39:23 compute-0 nova_compute[187118]: 2025-11-24 14:39:23.146 187122 DEBUG oslo_concurrency.lockutils [req-6bebdb3f-6a0a-4d9e-bc2e-a4baf288b943 req-7a64564e-b1ba-47f4-8f5d-8a91a8a96fab 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Acquiring lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:39:23 compute-0 nova_compute[187118]: 2025-11-24 14:39:23.147 187122 DEBUG oslo_concurrency.lockutils [req-6bebdb3f-6a0a-4d9e-bc2e-a4baf288b943 req-7a64564e-b1ba-47f4-8f5d-8a91a8a96fab 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:39:23 compute-0 nova_compute[187118]: 2025-11-24 14:39:23.147 187122 DEBUG oslo_concurrency.lockutils [req-6bebdb3f-6a0a-4d9e-bc2e-a4baf288b943 req-7a64564e-b1ba-47f4-8f5d-8a91a8a96fab 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] Lock "deec7f7a-de1e-4cb1-b74c-f47abc760797-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:39:23 compute-0 nova_compute[187118]: 2025-11-24 14:39:23.147 187122 DEBUG nova.compute.manager [req-6bebdb3f-6a0a-4d9e-bc2e-a4baf288b943 req-7a64564e-b1ba-47f4-8f5d-8a91a8a96fab 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] No waiting events found dispatching network-vif-plugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 24 14:39:23 compute-0 nova_compute[187118]: 2025-11-24 14:39:23.148 187122 WARNING nova.compute.manager [req-6bebdb3f-6a0a-4d9e-bc2e-a4baf288b943 req-7a64564e-b1ba-47f4-8f5d-8a91a8a96fab 9110f99fe9454134a24b16b526902e0e 3200104e2668497dac73dac32d5db05a - - default default] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Received unexpected event network-vif-plugged-1f30b9ed-543c-4644-a445-5d12cae7ae11 for instance with vm_state deleted and task_state None.
Nov 24 14:39:23 compute-0 podman[219283]: 2025-11-24 14:39:23.451797528 +0000 UTC m=+0.063624573 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:39:25 compute-0 nova_compute[187118]: 2025-11-24 14:39:25.031 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:26 compute-0 nova_compute[187118]: 2025-11-24 14:39:26.105 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:27 compute-0 nova_compute[187118]: 2025-11-24 14:39:27.433 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:27 compute-0 podman[219305]: 2025-11-24 14:39:27.480585849 +0000 UTC m=+0.080270001 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 14:39:27 compute-0 podman[219304]: 2025-11-24 14:39:27.495528308 +0000 UTC m=+0.087397575 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Nov 24 14:39:27 compute-0 nova_compute[187118]: 2025-11-24 14:39:27.536 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:30 compute-0 nova_compute[187118]: 2025-11-24 14:39:30.033 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:30.347 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:39:30 compute-0 nova_compute[187118]: 2025-11-24 14:39:30.348 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:30 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:30.349 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:39:31 compute-0 nova_compute[187118]: 2025-11-24 14:39:31.107 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:31 compute-0 podman[219344]: 2025-11-24 14:39:31.490143141 +0000 UTC m=+0.081254148 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Nov 24 14:39:31 compute-0 podman[219343]: 2025-11-24 14:39:31.525815778 +0000 UTC m=+0.121348696 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_controller, container_name=ovn_controller)
Nov 24 14:39:35 compute-0 nova_compute[187118]: 2025-11-24 14:39:35.035 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:39:35.135 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:39:36 compute-0 nova_compute[187118]: 2025-11-24 14:39:36.083 187122 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1763995161.0814, deec7f7a-de1e-4cb1-b74c-f47abc760797 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 24 14:39:36 compute-0 nova_compute[187118]: 2025-11-24 14:39:36.083 187122 INFO nova.compute.manager [-] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] VM Stopped (Lifecycle Event)
Nov 24 14:39:36 compute-0 nova_compute[187118]: 2025-11-24 14:39:36.104 187122 DEBUG nova.compute.manager [None req-d1afa67b-bc29-4f54-bf0d-7f92da99893a - - - - - -] [instance: deec7f7a-de1e-4cb1-b74c-f47abc760797] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 24 14:39:36 compute-0 nova_compute[187118]: 2025-11-24 14:39:36.110 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:39 compute-0 podman[219391]: 2025-11-24 14:39:39.484787157 +0000 UTC m=+0.078787680 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:39:39 compute-0 nova_compute[187118]: 2025-11-24 14:39:39.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:40 compute-0 nova_compute[187118]: 2025-11-24 14:39:40.037 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:40 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:40.351 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:39:41 compute-0 nova_compute[187118]: 2025-11-24 14:39:41.112 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:42 compute-0 nova_compute[187118]: 2025-11-24 14:39:42.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:42 compute-0 nova_compute[187118]: 2025-11-24 14:39:42.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:42 compute-0 nova_compute[187118]: 2025-11-24 14:39:42.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:42 compute-0 nova_compute[187118]: 2025-11-24 14:39:42.830 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:39:42 compute-0 nova_compute[187118]: 2025-11-24 14:39:42.831 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:39:42 compute-0 nova_compute[187118]: 2025-11-24 14:39:42.831 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:39:42 compute-0 nova_compute[187118]: 2025-11-24 14:39:42.832 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.033 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.034 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5759MB free_disk=73.45866775512695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.034 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.035 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.240 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.241 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.311 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Refreshing inventories for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.378 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updating ProviderTree inventory for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.378 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updating inventory in ProviderTree for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.395 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Refreshing aggregate associations for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.413 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Refreshing trait associations for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b, traits: COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AESNI,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.442 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.457 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.486 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.487 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.452s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.488 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.488 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 14:39:43 compute-0 nova_compute[187118]: 2025-11-24 14:39:43.500 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 14:39:45 compute-0 nova_compute[187118]: 2025-11-24 14:39:45.040 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:45 compute-0 nova_compute[187118]: 2025-11-24 14:39:45.502 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:45 compute-0 nova_compute[187118]: 2025-11-24 14:39:45.502 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:39:45 compute-0 nova_compute[187118]: 2025-11-24 14:39:45.503 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:39:45 compute-0 nova_compute[187118]: 2025-11-24 14:39:45.515 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:39:45 compute-0 nova_compute[187118]: 2025-11-24 14:39:45.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:45 compute-0 nova_compute[187118]: 2025-11-24 14:39:45.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:45 compute-0 nova_compute[187118]: 2025-11-24 14:39:45.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:39:45 compute-0 nova_compute[187118]: 2025-11-24 14:39:45.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:46 compute-0 nova_compute[187118]: 2025-11-24 14:39:46.114 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:46 compute-0 podman[219416]: 2025-11-24 14:39:46.475700484 +0000 UTC m=+0.066121500 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:39:47 compute-0 nova_compute[187118]: 2025-11-24 14:39:47.806 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:49 compute-0 nova_compute[187118]: 2025-11-24 14:39:49.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:50 compute-0 nova_compute[187118]: 2025-11-24 14:39:50.042 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:50 compute-0 nova_compute[187118]: 2025-11-24 14:39:50.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:39:50 compute-0 nova_compute[187118]: 2025-11-24 14:39:50.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 14:39:51 compute-0 nova_compute[187118]: 2025-11-24 14:39:51.117 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:54 compute-0 podman[219442]: 2025-11-24 14:39:54.484496879 +0000 UTC m=+0.094122704 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Nov 24 14:39:55 compute-0 nova_compute[187118]: 2025-11-24 14:39:55.043 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:56 compute-0 nova_compute[187118]: 2025-11-24 14:39:56.119 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:39:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:56.665 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:39:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:56.666 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:39:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:39:56.666 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:39:57 compute-0 ovn_controller[95613]: 2025-11-24T14:39:57Z|00167|memory_trim|INFO|Detected inactivity (last active 30016 ms ago): trimming memory
Nov 24 14:39:58 compute-0 podman[219461]: 2025-11-24 14:39:58.454426325 +0000 UTC m=+0.056059417 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:39:58 compute-0 podman[219462]: 2025-11-24 14:39:58.455756642 +0000 UTC m=+0.054310010 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 14:40:00 compute-0 nova_compute[187118]: 2025-11-24 14:40:00.045 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:01 compute-0 nova_compute[187118]: 2025-11-24 14:40:01.122 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:02 compute-0 podman[219503]: 2025-11-24 14:40:02.47694938 +0000 UTC m=+0.078448726 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 24 14:40:02 compute-0 podman[219502]: 2025-11-24 14:40:02.493565893 +0000 UTC m=+0.104681140 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:40:05 compute-0 nova_compute[187118]: 2025-11-24 14:40:05.046 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:06 compute-0 nova_compute[187118]: 2025-11-24 14:40:06.124 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:10 compute-0 nova_compute[187118]: 2025-11-24 14:40:10.048 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:10 compute-0 podman[219549]: 2025-11-24 14:40:10.94183288 +0000 UTC m=+0.534472059 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:40:11 compute-0 nova_compute[187118]: 2025-11-24 14:40:11.126 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:15 compute-0 nova_compute[187118]: 2025-11-24 14:40:15.049 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:16 compute-0 nova_compute[187118]: 2025-11-24 14:40:16.128 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:17 compute-0 podman[219572]: 2025-11-24 14:40:17.439534536 +0000 UTC m=+0.048083668 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:40:18 compute-0 sshd-session[219597]: Accepted publickey for zuul from 192.168.122.10 port 48088 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:40:18 compute-0 systemd-logind[807]: New session 27 of user zuul.
Nov 24 14:40:18 compute-0 systemd[1]: Started Session 27 of User zuul.
Nov 24 14:40:18 compute-0 sshd-session[219597]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:40:18 compute-0 sudo[219601]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 24 14:40:18 compute-0 sudo[219601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:40:20 compute-0 nova_compute[187118]: 2025-11-24 14:40:20.051 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:21 compute-0 nova_compute[187118]: 2025-11-24 14:40:21.132 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:22 compute-0 ovs-vsctl[219774]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 24 14:40:23 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 219625 (sos)
Nov 24 14:40:23 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 24 14:40:23 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 24 14:40:23 compute-0 virtqemud[186719]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 24 14:40:24 compute-0 virtqemud[186719]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 24 14:40:24 compute-0 virtqemud[186719]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 24 14:40:24 compute-0 podman[220077]: 2025-11-24 14:40:24.761408315 +0000 UTC m=+0.068215617 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2)
Nov 24 14:40:25 compute-0 nova_compute[187118]: 2025-11-24 14:40:25.053 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:25 compute-0 crontab[220206]: (root) LIST (root)
Nov 24 14:40:26 compute-0 nova_compute[187118]: 2025-11-24 14:40:26.134 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:27 compute-0 systemd[1]: Starting Hostname Service...
Nov 24 14:40:27 compute-0 systemd[1]: Started Hostname Service.
Nov 24 14:40:28 compute-0 podman[220422]: 2025-11-24 14:40:28.799566207 +0000 UTC m=+0.061543316 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:40:28 compute-0 podman[220421]: 2025-11-24 14:40:28.80956938 +0000 UTC m=+0.069958325 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 14:40:30 compute-0 nova_compute[187118]: 2025-11-24 14:40:30.055 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:31 compute-0 nova_compute[187118]: 2025-11-24 14:40:31.137 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:33 compute-0 podman[221063]: 2025-11-24 14:40:33.127965388 +0000 UTC m=+0.073537372 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, version=9.6)
Nov 24 14:40:33 compute-0 podman[221060]: 2025-11-24 14:40:33.153564796 +0000 UTC m=+0.097316520 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118)
Nov 24 14:40:34 compute-0 ovs-appctl[221522]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 24 14:40:34 compute-0 ovs-appctl[221526]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 24 14:40:34 compute-0 ovs-appctl[221533]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 24 14:40:35 compute-0 nova_compute[187118]: 2025-11-24 14:40:35.056 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:36 compute-0 nova_compute[187118]: 2025-11-24 14:40:36.139 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:40 compute-0 nova_compute[187118]: 2025-11-24 14:40:40.057 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:40 compute-0 virtqemud[186719]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 24 14:40:40 compute-0 nova_compute[187118]: 2025-11-24 14:40:40.823 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:41 compute-0 podman[222862]: 2025-11-24 14:40:41.050559139 +0000 UTC m=+0.062092462 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:40:41 compute-0 nova_compute[187118]: 2025-11-24 14:40:41.141 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:42 compute-0 systemd[1]: Starting Time & Date Service...
Nov 24 14:40:42 compute-0 systemd[1]: Started Time & Date Service.
Nov 24 14:40:42 compute-0 nova_compute[187118]: 2025-11-24 14:40:42.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:42 compute-0 nova_compute[187118]: 2025-11-24 14:40:42.798 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:43 compute-0 nova_compute[187118]: 2025-11-24 14:40:43.792 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:44 compute-0 nova_compute[187118]: 2025-11-24 14:40:44.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:44 compute-0 nova_compute[187118]: 2025-11-24 14:40:44.816 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:40:44 compute-0 nova_compute[187118]: 2025-11-24 14:40:44.816 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:40:44 compute-0 nova_compute[187118]: 2025-11-24 14:40:44.816 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:40:44 compute-0 nova_compute[187118]: 2025-11-24 14:40:44.816 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:40:44 compute-0 nova_compute[187118]: 2025-11-24 14:40:44.987 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:40:44 compute-0 nova_compute[187118]: 2025-11-24 14:40:44.988 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5401MB free_disk=73.05043029785156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:40:44 compute-0 nova_compute[187118]: 2025-11-24 14:40:44.988 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:40:44 compute-0 nova_compute[187118]: 2025-11-24 14:40:44.989 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:40:45 compute-0 nova_compute[187118]: 2025-11-24 14:40:45.048 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:40:45 compute-0 nova_compute[187118]: 2025-11-24 14:40:45.048 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:40:45 compute-0 nova_compute[187118]: 2025-11-24 14:40:45.058 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:45 compute-0 nova_compute[187118]: 2025-11-24 14:40:45.069 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:40:45 compute-0 nova_compute[187118]: 2025-11-24 14:40:45.083 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:40:45 compute-0 nova_compute[187118]: 2025-11-24 14:40:45.085 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:40:45 compute-0 nova_compute[187118]: 2025-11-24 14:40:45.085 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:40:46 compute-0 nova_compute[187118]: 2025-11-24 14:40:46.085 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:46 compute-0 nova_compute[187118]: 2025-11-24 14:40:46.086 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:46 compute-0 nova_compute[187118]: 2025-11-24 14:40:46.086 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:40:46 compute-0 nova_compute[187118]: 2025-11-24 14:40:46.143 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:46 compute-0 nova_compute[187118]: 2025-11-24 14:40:46.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:46 compute-0 nova_compute[187118]: 2025-11-24 14:40:46.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:40:46 compute-0 nova_compute[187118]: 2025-11-24 14:40:46.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:40:46 compute-0 nova_compute[187118]: 2025-11-24 14:40:46.813 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:40:48 compute-0 podman[223027]: 2025-11-24 14:40:48.147471894 +0000 UTC m=+0.070976572 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 24 14:40:48 compute-0 nova_compute[187118]: 2025-11-24 14:40:48.807 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:50 compute-0 nova_compute[187118]: 2025-11-24 14:40:50.059 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:50 compute-0 nova_compute[187118]: 2025-11-24 14:40:50.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:40:51 compute-0 nova_compute[187118]: 2025-11-24 14:40:51.145 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:55 compute-0 nova_compute[187118]: 2025-11-24 14:40:55.062 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:55 compute-0 podman[223050]: 2025-11-24 14:40:55.451883218 +0000 UTC m=+0.063595402 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:40:56 compute-0 nova_compute[187118]: 2025-11-24 14:40:56.147 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:40:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:40:56.666 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:40:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:40:56.667 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:40:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:40:56.667 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:40:59 compute-0 podman[223071]: 2025-11-24 14:40:59.46102257 +0000 UTC m=+0.063218342 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, io.buildah.version=1.41.3)
Nov 24 14:40:59 compute-0 podman[223070]: 2025-11-24 14:40:59.47388927 +0000 UTC m=+0.072963347 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:41:00 compute-0 nova_compute[187118]: 2025-11-24 14:41:00.065 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:01 compute-0 nova_compute[187118]: 2025-11-24 14:41:01.149 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:02 compute-0 sudo[219601]: pam_unix(sudo:session): session closed for user root
Nov 24 14:41:02 compute-0 sshd-session[219600]: Received disconnect from 192.168.122.10 port 48088:11: disconnected by user
Nov 24 14:41:02 compute-0 sshd-session[219600]: Disconnected from user zuul 192.168.122.10 port 48088
Nov 24 14:41:02 compute-0 sshd-session[219597]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:41:02 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Nov 24 14:41:02 compute-0 systemd[1]: session-27.scope: Consumed 1min 12.771s CPU time, 491.5M memory peak, read 101.2M from disk, written 36.6M to disk.
Nov 24 14:41:02 compute-0 systemd-logind[807]: Session 27 logged out. Waiting for processes to exit.
Nov 24 14:41:02 compute-0 systemd-logind[807]: Removed session 27.
Nov 24 14:41:02 compute-0 sshd-session[223110]: Accepted publickey for zuul from 192.168.122.10 port 57126 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:41:02 compute-0 systemd-logind[807]: New session 28 of user zuul.
Nov 24 14:41:02 compute-0 systemd[1]: Started Session 28 of User zuul.
Nov 24 14:41:02 compute-0 sshd-session[223110]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:41:02 compute-0 sudo[223114]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-11-24-qildknv.tar.xz
Nov 24 14:41:02 compute-0 sudo[223114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:41:02 compute-0 sudo[223114]: pam_unix(sudo:session): session closed for user root
Nov 24 14:41:02 compute-0 sshd-session[223113]: Received disconnect from 192.168.122.10 port 57126:11: disconnected by user
Nov 24 14:41:02 compute-0 sshd-session[223113]: Disconnected from user zuul 192.168.122.10 port 57126
Nov 24 14:41:02 compute-0 sshd-session[223110]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:41:02 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Nov 24 14:41:02 compute-0 systemd-logind[807]: Session 28 logged out. Waiting for processes to exit.
Nov 24 14:41:02 compute-0 systemd-logind[807]: Removed session 28.
Nov 24 14:41:02 compute-0 sshd-session[223139]: Accepted publickey for zuul from 192.168.122.10 port 57128 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:41:02 compute-0 systemd-logind[807]: New session 29 of user zuul.
Nov 24 14:41:02 compute-0 systemd[1]: Started Session 29 of User zuul.
Nov 24 14:41:02 compute-0 sshd-session[223139]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:41:02 compute-0 sudo[223143]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Nov 24 14:41:02 compute-0 sudo[223143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:41:02 compute-0 sudo[223143]: pam_unix(sudo:session): session closed for user root
Nov 24 14:41:02 compute-0 sshd-session[223142]: Received disconnect from 192.168.122.10 port 57128:11: disconnected by user
Nov 24 14:41:02 compute-0 sshd-session[223142]: Disconnected from user zuul 192.168.122.10 port 57128
Nov 24 14:41:02 compute-0 sshd-session[223139]: pam_unix(sshd:session): session closed for user zuul
Nov 24 14:41:02 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Nov 24 14:41:02 compute-0 systemd-logind[807]: Session 29 logged out. Waiting for processes to exit.
Nov 24 14:41:02 compute-0 systemd-logind[807]: Removed session 29.
Nov 24 14:41:03 compute-0 podman[223169]: 2025-11-24 14:41:03.460480269 +0000 UTC m=+0.064955008 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 14:41:03 compute-0 podman[223168]: 2025-11-24 14:41:03.502672558 +0000 UTC m=+0.099382846 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Nov 24 14:41:05 compute-0 nova_compute[187118]: 2025-11-24 14:41:05.069 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:06 compute-0 nova_compute[187118]: 2025-11-24 14:41:06.151 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:10 compute-0 nova_compute[187118]: 2025-11-24 14:41:10.069 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:11 compute-0 nova_compute[187118]: 2025-11-24 14:41:11.155 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:11 compute-0 podman[223214]: 2025-11-24 14:41:11.448551511 +0000 UTC m=+0.062784080 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:41:12 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 24 14:41:12 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 24 14:41:15 compute-0 nova_compute[187118]: 2025-11-24 14:41:15.070 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:16 compute-0 nova_compute[187118]: 2025-11-24 14:41:16.158 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:18 compute-0 podman[223243]: 2025-11-24 14:41:18.463895878 +0000 UTC m=+0.068058523 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:41:20 compute-0 nova_compute[187118]: 2025-11-24 14:41:20.074 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:21 compute-0 nova_compute[187118]: 2025-11-24 14:41:21.160 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:25 compute-0 nova_compute[187118]: 2025-11-24 14:41:25.074 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:26 compute-0 nova_compute[187118]: 2025-11-24 14:41:26.163 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:26 compute-0 podman[223267]: 2025-11-24 14:41:26.48250959 +0000 UTC m=+0.086071213 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 24 14:41:30 compute-0 nova_compute[187118]: 2025-11-24 14:41:30.076 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:30 compute-0 podman[223287]: 2025-11-24 14:41:30.481482565 +0000 UTC m=+0.075378472 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 24 14:41:30 compute-0 podman[223286]: 2025-11-24 14:41:30.493616325 +0000 UTC m=+0.097097214 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 14:41:31 compute-0 nova_compute[187118]: 2025-11-24 14:41:31.166 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:34 compute-0 podman[223324]: 2025-11-24 14:41:34.476460201 +0000 UTC m=+0.076983476 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 24 14:41:34 compute-0 podman[223323]: 2025-11-24 14:41:34.492477108 +0000 UTC m=+0.097247408 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS)
Nov 24 14:41:35 compute-0 nova_compute[187118]: 2025-11-24 14:41:35.077 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:41:35.134 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:41:36 compute-0 nova_compute[187118]: 2025-11-24 14:41:36.168 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:40 compute-0 nova_compute[187118]: 2025-11-24 14:41:40.078 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:41 compute-0 nova_compute[187118]: 2025-11-24 14:41:41.171 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:42 compute-0 podman[223370]: 2025-11-24 14:41:42.432569853 +0000 UTC m=+0.046173939 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:41:42 compute-0 nova_compute[187118]: 2025-11-24 14:41:42.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:41:44 compute-0 nova_compute[187118]: 2025-11-24 14:41:44.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:41:44 compute-0 nova_compute[187118]: 2025-11-24 14:41:44.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:41:45 compute-0 nova_compute[187118]: 2025-11-24 14:41:45.080 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.173 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.824 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.824 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.824 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.848 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.848 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.849 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.849 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.989 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.990 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5714MB free_disk=73.4583625793457GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.990 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:41:46 compute-0 nova_compute[187118]: 2025-11-24 14:41:46.991 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:41:47 compute-0 nova_compute[187118]: 2025-11-24 14:41:47.061 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:41:47 compute-0 nova_compute[187118]: 2025-11-24 14:41:47.062 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:41:47 compute-0 nova_compute[187118]: 2025-11-24 14:41:47.099 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:41:47 compute-0 nova_compute[187118]: 2025-11-24 14:41:47.121 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:41:47 compute-0 nova_compute[187118]: 2025-11-24 14:41:47.123 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:41:47 compute-0 nova_compute[187118]: 2025-11-24 14:41:47.124 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:41:48 compute-0 nova_compute[187118]: 2025-11-24 14:41:48.096 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:41:48 compute-0 nova_compute[187118]: 2025-11-24 14:41:48.097 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:41:49 compute-0 podman[223395]: 2025-11-24 14:41:49.459772021 +0000 UTC m=+0.064159128 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:41:50 compute-0 nova_compute[187118]: 2025-11-24 14:41:50.082 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:50 compute-0 nova_compute[187118]: 2025-11-24 14:41:50.791 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:41:51 compute-0 nova_compute[187118]: 2025-11-24 14:41:51.175 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:51 compute-0 nova_compute[187118]: 2025-11-24 14:41:51.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:41:55 compute-0 nova_compute[187118]: 2025-11-24 14:41:55.085 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:56 compute-0 nova_compute[187118]: 2025-11-24 14:41:56.178 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:41:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:41:56.667 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:41:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:41:56.668 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:41:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:41:56.668 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:41:57 compute-0 podman[223419]: 2025-11-24 14:41:57.468122927 +0000 UTC m=+0.074744825 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 14:42:00 compute-0 nova_compute[187118]: 2025-11-24 14:42:00.088 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:01 compute-0 nova_compute[187118]: 2025-11-24 14:42:01.182 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:01 compute-0 podman[223438]: 2025-11-24 14:42:01.519185347 +0000 UTC m=+0.115568644 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 24 14:42:01 compute-0 podman[223439]: 2025-11-24 14:42:01.530284823 +0000 UTC m=+0.092311124 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 24 14:42:05 compute-0 nova_compute[187118]: 2025-11-24 14:42:05.090 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:05 compute-0 podman[223479]: 2025-11-24 14:42:05.510753219 +0000 UTC m=+0.097182524 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, version=9.6, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 24 14:42:05 compute-0 podman[223478]: 2025-11-24 14:42:05.536280449 +0000 UTC m=+0.133403809 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:42:06 compute-0 nova_compute[187118]: 2025-11-24 14:42:06.185 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:10 compute-0 nova_compute[187118]: 2025-11-24 14:42:10.092 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:11 compute-0 nova_compute[187118]: 2025-11-24 14:42:11.189 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:13 compute-0 podman[223525]: 2025-11-24 14:42:13.444884915 +0000 UTC m=+0.053076867 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:42:15 compute-0 nova_compute[187118]: 2025-11-24 14:42:15.093 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:16 compute-0 nova_compute[187118]: 2025-11-24 14:42:16.192 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:20 compute-0 nova_compute[187118]: 2025-11-24 14:42:20.094 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:20 compute-0 podman[223550]: 2025-11-24 14:42:20.17677628 +0000 UTC m=+0.057758941 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:42:21 compute-0 nova_compute[187118]: 2025-11-24 14:42:21.199 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:25 compute-0 nova_compute[187118]: 2025-11-24 14:42:25.096 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:26 compute-0 nova_compute[187118]: 2025-11-24 14:42:26.202 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:28 compute-0 podman[223576]: 2025-11-24 14:42:28.523900735 +0000 UTC m=+0.068611153 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:42:30 compute-0 nova_compute[187118]: 2025-11-24 14:42:30.098 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:31 compute-0 nova_compute[187118]: 2025-11-24 14:42:31.206 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:32 compute-0 podman[223595]: 2025-11-24 14:42:32.456440802 +0000 UTC m=+0.055283356 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=multipathd, org.label-schema.build-date=20251118, io.buildah.version=1.41.3)
Nov 24 14:42:32 compute-0 podman[223596]: 2025-11-24 14:42:32.484276605 +0000 UTC m=+0.075216378 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 14:42:35 compute-0 nova_compute[187118]: 2025-11-24 14:42:35.100 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:36 compute-0 nova_compute[187118]: 2025-11-24 14:42:36.209 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:36 compute-0 podman[223636]: 2025-11-24 14:42:36.45613138 +0000 UTC m=+0.049901512 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7)
Nov 24 14:42:36 compute-0 podman[223635]: 2025-11-24 14:42:36.477441239 +0000 UTC m=+0.076390699 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 14:42:40 compute-0 nova_compute[187118]: 2025-11-24 14:42:40.102 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:41 compute-0 nova_compute[187118]: 2025-11-24 14:42:41.214 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:42 compute-0 nova_compute[187118]: 2025-11-24 14:42:42.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:44 compute-0 podman[223680]: 2025-11-24 14:42:44.475217243 +0000 UTC m=+0.067593014 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:42:44 compute-0 nova_compute[187118]: 2025-11-24 14:42:44.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:44 compute-0 nova_compute[187118]: 2025-11-24 14:42:44.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:45 compute-0 nova_compute[187118]: 2025-11-24 14:42:45.103 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.217 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.828 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.829 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.829 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.830 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.986 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.986 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5741MB free_disk=73.4582290649414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.987 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:42:46 compute-0 nova_compute[187118]: 2025-11-24 14:42:46.987 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:42:47 compute-0 nova_compute[187118]: 2025-11-24 14:42:47.037 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:42:47 compute-0 nova_compute[187118]: 2025-11-24 14:42:47.038 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:42:47 compute-0 nova_compute[187118]: 2025-11-24 14:42:47.054 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:42:47 compute-0 nova_compute[187118]: 2025-11-24 14:42:47.063 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:42:47 compute-0 nova_compute[187118]: 2025-11-24 14:42:47.064 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:42:47 compute-0 nova_compute[187118]: 2025-11-24 14:42:47.064 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:42:48 compute-0 nova_compute[187118]: 2025-11-24 14:42:48.058 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:48 compute-0 nova_compute[187118]: 2025-11-24 14:42:48.077 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:48 compute-0 nova_compute[187118]: 2025-11-24 14:42:48.078 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:42:48 compute-0 nova_compute[187118]: 2025-11-24 14:42:48.078 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:42:48 compute-0 nova_compute[187118]: 2025-11-24 14:42:48.092 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:42:48 compute-0 nova_compute[187118]: 2025-11-24 14:42:48.093 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:48 compute-0 nova_compute[187118]: 2025-11-24 14:42:48.093 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:42:50 compute-0 nova_compute[187118]: 2025-11-24 14:42:50.106 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:50 compute-0 podman[223704]: 2025-11-24 14:42:50.457340177 +0000 UTC m=+0.065141068 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 14:42:51 compute-0 nova_compute[187118]: 2025-11-24 14:42:51.225 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:51 compute-0 nova_compute[187118]: 2025-11-24 14:42:51.827 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:52 compute-0 nova_compute[187118]: 2025-11-24 14:42:52.802 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:42:55 compute-0 nova_compute[187118]: 2025-11-24 14:42:55.106 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:56 compute-0 nova_compute[187118]: 2025-11-24 14:42:56.229 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:42:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:42:56.669 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:42:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:42:56.669 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:42:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:42:56.669 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:42:59 compute-0 podman[223728]: 2025-11-24 14:42:59.472218425 +0000 UTC m=+0.064788319 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118)
Nov 24 14:43:00 compute-0 nova_compute[187118]: 2025-11-24 14:43:00.110 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:01 compute-0 nova_compute[187118]: 2025-11-24 14:43:01.233 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:03 compute-0 podman[223748]: 2025-11-24 14:43:03.465682088 +0000 UTC m=+0.063101225 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 14:43:03 compute-0 podman[223747]: 2025-11-24 14:43:03.487618004 +0000 UTC m=+0.086921100 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:43:05 compute-0 nova_compute[187118]: 2025-11-24 14:43:05.114 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:06 compute-0 nova_compute[187118]: 2025-11-24 14:43:06.240 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:07 compute-0 podman[223785]: 2025-11-24 14:43:07.482313028 +0000 UTC m=+0.067541632 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, version=9.6, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 24 14:43:07 compute-0 podman[223784]: 2025-11-24 14:43:07.524186175 +0000 UTC m=+0.116350975 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 24 14:43:10 compute-0 nova_compute[187118]: 2025-11-24 14:43:10.115 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:11 compute-0 nova_compute[187118]: 2025-11-24 14:43:11.243 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:15 compute-0 nova_compute[187118]: 2025-11-24 14:43:15.117 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:15 compute-0 podman[223832]: 2025-11-24 14:43:15.440178649 +0000 UTC m=+0.054300900 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:43:16 compute-0 nova_compute[187118]: 2025-11-24 14:43:16.248 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:20 compute-0 nova_compute[187118]: 2025-11-24 14:43:20.119 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:21 compute-0 nova_compute[187118]: 2025-11-24 14:43:21.250 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:21 compute-0 podman[223856]: 2025-11-24 14:43:21.453183036 +0000 UTC m=+0.061792399 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 24 14:43:25 compute-0 nova_compute[187118]: 2025-11-24 14:43:25.121 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:26 compute-0 nova_compute[187118]: 2025-11-24 14:43:26.252 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:30 compute-0 nova_compute[187118]: 2025-11-24 14:43:30.122 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:30 compute-0 podman[223880]: 2025-11-24 14:43:30.486020874 +0000 UTC m=+0.088034749 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 14:43:31 compute-0 nova_compute[187118]: 2025-11-24 14:43:31.254 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:34 compute-0 podman[223899]: 2025-11-24 14:43:34.478771519 +0000 UTC m=+0.076181994 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 24 14:43:34 compute-0 podman[223900]: 2025-11-24 14:43:34.492195756 +0000 UTC m=+0.086262391 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:43:35 compute-0 nova_compute[187118]: 2025-11-24 14:43:35.124 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.130 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:43:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:43:36 compute-0 nova_compute[187118]: 2025-11-24 14:43:36.257 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:38 compute-0 podman[223938]: 2025-11-24 14:43:38.465417369 +0000 UTC m=+0.066097365 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Nov 24 14:43:38 compute-0 podman[223937]: 2025-11-24 14:43:38.513926492 +0000 UTC m=+0.117135846 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 24 14:43:40 compute-0 nova_compute[187118]: 2025-11-24 14:43:40.126 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:41 compute-0 nova_compute[187118]: 2025-11-24 14:43:41.260 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:43 compute-0 nova_compute[187118]: 2025-11-24 14:43:43.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:43:45 compute-0 nova_compute[187118]: 2025-11-24 14:43:45.126 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:45 compute-0 nova_compute[187118]: 2025-11-24 14:43:45.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:43:46 compute-0 nova_compute[187118]: 2025-11-24 14:43:46.263 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:46 compute-0 podman[223983]: 2025-11-24 14:43:46.444168885 +0000 UTC m=+0.051901096 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:43:46 compute-0 nova_compute[187118]: 2025-11-24 14:43:46.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:43:46 compute-0 nova_compute[187118]: 2025-11-24 14:43:46.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:43:48 compute-0 nova_compute[187118]: 2025-11-24 14:43:48.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:43:48 compute-0 nova_compute[187118]: 2025-11-24 14:43:48.796 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:43:48 compute-0 nova_compute[187118]: 2025-11-24 14:43:48.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:43:48 compute-0 nova_compute[187118]: 2025-11-24 14:43:48.863 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:43:48 compute-0 nova_compute[187118]: 2025-11-24 14:43:48.863 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:43:48 compute-0 nova_compute[187118]: 2025-11-24 14:43:48.964 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:43:48 compute-0 nova_compute[187118]: 2025-11-24 14:43:48.964 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:43:48 compute-0 nova_compute[187118]: 2025-11-24 14:43:48.965 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:43:48 compute-0 nova_compute[187118]: 2025-11-24 14:43:48.965 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.114 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.115 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5752MB free_disk=73.4582290649414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.115 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.115 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.232 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.232 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.264 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.297 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.298 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:43:49 compute-0 nova_compute[187118]: 2025-11-24 14:43:49.299 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:43:50 compute-0 nova_compute[187118]: 2025-11-24 14:43:50.128 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:50 compute-0 nova_compute[187118]: 2025-11-24 14:43:50.231 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:43:50 compute-0 nova_compute[187118]: 2025-11-24 14:43:50.232 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:43:51 compute-0 nova_compute[187118]: 2025-11-24 14:43:51.264 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:51 compute-0 nova_compute[187118]: 2025-11-24 14:43:51.617 187122 DEBUG oslo_concurrency.processutils [None req-e8f6eaa8-cd10-47cc-be8e-49cd4cb2b326 a321dadc40f749a2a6578823fcda9ed0 5f2c2c59dcfb47f49d179fade7a63aba - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 24 14:43:51 compute-0 nova_compute[187118]: 2025-11-24 14:43:51.648 187122 DEBUG oslo_concurrency.processutils [None req-e8f6eaa8-cd10-47cc-be8e-49cd4cb2b326 a321dadc40f749a2a6578823fcda9ed0 5f2c2c59dcfb47f49d179fade7a63aba - - default default] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 24 14:43:52 compute-0 podman[224011]: 2025-11-24 14:43:52.43762217 +0000 UTC m=+0.052206903 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:43:52 compute-0 nova_compute[187118]: 2025-11-24 14:43:52.790 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:43:53 compute-0 nova_compute[187118]: 2025-11-24 14:43:53.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:43:55 compute-0 nova_compute[187118]: 2025-11-24 14:43:55.130 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:43:56.156 104469 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:9d:c1', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'c6:cd:23:07:a9:23'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 24 14:43:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:43:56.157 104469 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 24 14:43:56 compute-0 nova_compute[187118]: 2025-11-24 14:43:56.157 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:56 compute-0 nova_compute[187118]: 2025-11-24 14:43:56.266 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:43:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:43:56.670 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:43:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:43:56.671 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:43:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:43:56.671 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:43:59 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:43:59.159 104469 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=dfd2f9fd-c9ed-4d16-a231-48176f986586, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 24 14:44:00 compute-0 nova_compute[187118]: 2025-11-24 14:44:00.131 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:01 compute-0 nova_compute[187118]: 2025-11-24 14:44:01.269 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:01 compute-0 podman[224036]: 2025-11-24 14:44:01.432983294 +0000 UTC m=+0.048287012 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 24 14:44:05 compute-0 nova_compute[187118]: 2025-11-24 14:44:05.133 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:05 compute-0 podman[224057]: 2025-11-24 14:44:05.452966099 +0000 UTC m=+0.056706988 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Nov 24 14:44:05 compute-0 podman[224056]: 2025-11-24 14:44:05.452966429 +0000 UTC m=+0.066706407 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 24 14:44:06 compute-0 nova_compute[187118]: 2025-11-24 14:44:06.272 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:09 compute-0 podman[224095]: 2025-11-24 14:44:09.455461461 +0000 UTC m=+0.063544453 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 24 14:44:09 compute-0 podman[224094]: 2025-11-24 14:44:09.487505025 +0000 UTC m=+0.096739878 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 24 14:44:10 compute-0 nova_compute[187118]: 2025-11-24 14:44:10.136 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:11 compute-0 nova_compute[187118]: 2025-11-24 14:44:11.273 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:15 compute-0 nova_compute[187118]: 2025-11-24 14:44:15.137 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:16 compute-0 nova_compute[187118]: 2025-11-24 14:44:16.276 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:17 compute-0 podman[224141]: 2025-11-24 14:44:17.466602265 +0000 UTC m=+0.070300265 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:44:20 compute-0 nova_compute[187118]: 2025-11-24 14:44:20.139 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:21 compute-0 nova_compute[187118]: 2025-11-24 14:44:21.279 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:23 compute-0 podman[224167]: 2025-11-24 14:44:23.429356193 +0000 UTC m=+0.043550463 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:44:25 compute-0 nova_compute[187118]: 2025-11-24 14:44:25.139 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:26 compute-0 nova_compute[187118]: 2025-11-24 14:44:26.281 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:30 compute-0 nova_compute[187118]: 2025-11-24 14:44:30.141 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:31 compute-0 nova_compute[187118]: 2025-11-24 14:44:31.283 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:32 compute-0 podman[224191]: 2025-11-24 14:44:32.477802681 +0000 UTC m=+0.082858642 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:44:35 compute-0 nova_compute[187118]: 2025-11-24 14:44:35.142 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:36 compute-0 nova_compute[187118]: 2025-11-24 14:44:36.286 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:36 compute-0 podman[224210]: 2025-11-24 14:44:36.47041399 +0000 UTC m=+0.084542270 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:44:36 compute-0 podman[224211]: 2025-11-24 14:44:36.479380211 +0000 UTC m=+0.086081430 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:44:40 compute-0 nova_compute[187118]: 2025-11-24 14:44:40.142 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:40 compute-0 podman[224250]: 2025-11-24 14:44:40.46724644 +0000 UTC m=+0.080506170 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:44:40 compute-0 podman[224251]: 2025-11-24 14:44:40.468156965 +0000 UTC m=+0.078573698 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 24 14:44:41 compute-0 nova_compute[187118]: 2025-11-24 14:44:41.288 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:44 compute-0 nova_compute[187118]: 2025-11-24 14:44:44.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:45 compute-0 nova_compute[187118]: 2025-11-24 14:44:45.144 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:46 compute-0 nova_compute[187118]: 2025-11-24 14:44:46.290 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:46 compute-0 nova_compute[187118]: 2025-11-24 14:44:46.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:46 compute-0 nova_compute[187118]: 2025-11-24 14:44:46.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:46 compute-0 nova_compute[187118]: 2025-11-24 14:44:46.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:46 compute-0 nova_compute[187118]: 2025-11-24 14:44:46.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:46 compute-0 nova_compute[187118]: 2025-11-24 14:44:46.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 24 14:44:46 compute-0 nova_compute[187118]: 2025-11-24 14:44:46.810 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 24 14:44:48 compute-0 podman[224297]: 2025-11-24 14:44:48.480300234 +0000 UTC m=+0.081030754 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 24 14:44:48 compute-0 nova_compute[187118]: 2025-11-24 14:44:48.809 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:48 compute-0 nova_compute[187118]: 2025-11-24 14:44:48.810 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:44:48 compute-0 nova_compute[187118]: 2025-11-24 14:44:48.810 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:44:48 compute-0 nova_compute[187118]: 2025-11-24 14:44:48.829 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:44:48 compute-0 nova_compute[187118]: 2025-11-24 14:44:48.830 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:48 compute-0 nova_compute[187118]: 2025-11-24 14:44:48.864 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:44:48 compute-0 nova_compute[187118]: 2025-11-24 14:44:48.865 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:44:48 compute-0 nova_compute[187118]: 2025-11-24 14:44:48.865 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:44:48 compute-0 nova_compute[187118]: 2025-11-24 14:44:48.866 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.042 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.043 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5752MB free_disk=73.4582290649414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.044 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.044 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.362 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.363 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.451 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Refreshing inventories for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.533 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updating ProviderTree inventory for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.534 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Updating inventory in ProviderTree for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.563 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Refreshing aggregate associations for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.597 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Refreshing trait associations for resource provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b, traits: COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_FMA3,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_SSE41,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AESNI,COMPUTE_NODE,COMPUTE_VOLUME_EXTEND,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_BMI,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_AVX,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SVM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.633 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.655 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.656 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:44:49 compute-0 nova_compute[187118]: 2025-11-24 14:44:49.657 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.612s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:44:50 compute-0 nova_compute[187118]: 2025-11-24 14:44:50.146 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:50 compute-0 nova_compute[187118]: 2025-11-24 14:44:50.638 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:50 compute-0 nova_compute[187118]: 2025-11-24 14:44:50.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:50 compute-0 nova_compute[187118]: 2025-11-24 14:44:50.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:44:51 compute-0 nova_compute[187118]: 2025-11-24 14:44:51.292 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:53 compute-0 nova_compute[187118]: 2025-11-24 14:44:53.792 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:54 compute-0 podman[224321]: 2025-11-24 14:44:54.512573507 +0000 UTC m=+0.112196843 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:44:55 compute-0 nova_compute[187118]: 2025-11-24 14:44:55.150 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:55 compute-0 nova_compute[187118]: 2025-11-24 14:44:55.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:55 compute-0 nova_compute[187118]: 2025-11-24 14:44:55.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:44:56 compute-0 nova_compute[187118]: 2025-11-24 14:44:56.295 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:44:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:44:56.671 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:44:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:44:56.672 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:44:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:44:56.672 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:45:00 compute-0 nova_compute[187118]: 2025-11-24 14:45:00.149 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:01 compute-0 anacron[30966]: Job `cron.daily' started
Nov 24 14:45:01 compute-0 nova_compute[187118]: 2025-11-24 14:45:01.298 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:01 compute-0 anacron[30966]: Job `cron.daily' terminated
Nov 24 14:45:01 compute-0 nova_compute[187118]: 2025-11-24 14:45:01.808 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:01 compute-0 nova_compute[187118]: 2025-11-24 14:45:01.809 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Nov 24 14:45:03 compute-0 podman[224347]: 2025-11-24 14:45:03.430867795 +0000 UTC m=+0.040534154 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 14:45:05 compute-0 nova_compute[187118]: 2025-11-24 14:45:05.151 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:06 compute-0 nova_compute[187118]: 2025-11-24 14:45:06.299 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:07 compute-0 podman[224369]: 2025-11-24 14:45:07.463374545 +0000 UTC m=+0.073348017 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true)
Nov 24 14:45:07 compute-0 podman[224368]: 2025-11-24 14:45:07.470189348 +0000 UTC m=+0.079902053 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:45:10 compute-0 nova_compute[187118]: 2025-11-24 14:45:10.154 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:11 compute-0 nova_compute[187118]: 2025-11-24 14:45:11.302 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:11 compute-0 podman[224407]: 2025-11-24 14:45:11.451485775 +0000 UTC m=+0.060174913 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal)
Nov 24 14:45:11 compute-0 podman[224406]: 2025-11-24 14:45:11.477230168 +0000 UTC m=+0.086111181 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251118, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:45:15 compute-0 nova_compute[187118]: 2025-11-24 14:45:15.156 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:16 compute-0 nova_compute[187118]: 2025-11-24 14:45:16.305 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:19 compute-0 podman[224452]: 2025-11-24 14:45:19.476092827 +0000 UTC m=+0.079451481 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 24 14:45:20 compute-0 nova_compute[187118]: 2025-11-24 14:45:20.159 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:21 compute-0 nova_compute[187118]: 2025-11-24 14:45:21.308 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:25 compute-0 nova_compute[187118]: 2025-11-24 14:45:25.161 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:25 compute-0 podman[224477]: 2025-11-24 14:45:25.473068562 +0000 UTC m=+0.062514735 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:45:26 compute-0 nova_compute[187118]: 2025-11-24 14:45:26.310 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:30 compute-0 nova_compute[187118]: 2025-11-24 14:45:30.161 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:31 compute-0 nova_compute[187118]: 2025-11-24 14:45:31.311 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:34 compute-0 podman[224501]: 2025-11-24 14:45:34.447520673 +0000 UTC m=+0.056934715 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:45:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:45:35 compute-0 nova_compute[187118]: 2025-11-24 14:45:35.163 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:36 compute-0 nova_compute[187118]: 2025-11-24 14:45:36.316 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:38 compute-0 podman[224521]: 2025-11-24 14:45:38.456141573 +0000 UTC m=+0.063192623 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:45:38 compute-0 podman[224522]: 2025-11-24 14:45:38.467598981 +0000 UTC m=+0.068854245 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, 
org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:45:40 compute-0 nova_compute[187118]: 2025-11-24 14:45:40.166 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:41 compute-0 nova_compute[187118]: 2025-11-24 14:45:41.331 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:42 compute-0 podman[224562]: 2025-11-24 14:45:42.471816081 +0000 UTC m=+0.064406896 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public)
Nov 24 14:45:42 compute-0 podman[224561]: 2025-11-24 14:45:42.48552968 +0000 UTC m=+0.088066114 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 24 14:45:44 compute-0 nova_compute[187118]: 2025-11-24 14:45:44.816 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:45 compute-0 nova_compute[187118]: 2025-11-24 14:45:45.171 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:46 compute-0 nova_compute[187118]: 2025-11-24 14:45:46.335 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:48 compute-0 nova_compute[187118]: 2025-11-24 14:45:48.755 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:48 compute-0 nova_compute[187118]: 2025-11-24 14:45:48.811 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:48 compute-0 nova_compute[187118]: 2025-11-24 14:45:48.811 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:45:48 compute-0 nova_compute[187118]: 2025-11-24 14:45:48.811 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:45:48 compute-0 nova_compute[187118]: 2025-11-24 14:45:48.973 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:45:48 compute-0 nova_compute[187118]: 2025-11-24 14:45:48.974 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:48 compute-0 nova_compute[187118]: 2025-11-24 14:45:48.974 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:48 compute-0 nova_compute[187118]: 2025-11-24 14:45:48.974 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:49 compute-0 nova_compute[187118]: 2025-11-24 14:45:49.795 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:49 compute-0 nova_compute[187118]: 2025-11-24 14:45:49.818 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:45:49 compute-0 nova_compute[187118]: 2025-11-24 14:45:49.819 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:45:49 compute-0 nova_compute[187118]: 2025-11-24 14:45:49.819 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:45:49 compute-0 nova_compute[187118]: 2025-11-24 14:45:49.819 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.006 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.007 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5739MB free_disk=73.45822525024414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.007 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.007 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.079 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.080 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.113 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.132 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.133 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.133 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:45:50 compute-0 nova_compute[187118]: 2025-11-24 14:45:50.173 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:50 compute-0 podman[224610]: 2025-11-24 14:45:50.444405965 +0000 UTC m=+0.055788654 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:45:51 compute-0 nova_compute[187118]: 2025-11-24 14:45:51.337 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:53 compute-0 nova_compute[187118]: 2025-11-24 14:45:53.137 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:53 compute-0 nova_compute[187118]: 2025-11-24 14:45:53.137 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:45:54 compute-0 nova_compute[187118]: 2025-11-24 14:45:54.792 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:55 compute-0 nova_compute[187118]: 2025-11-24 14:45:55.175 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:55 compute-0 nova_compute[187118]: 2025-11-24 14:45:55.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:45:56 compute-0 nova_compute[187118]: 2025-11-24 14:45:56.340 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:45:56 compute-0 podman[224634]: 2025-11-24 14:45:56.458713264 +0000 UTC m=+0.063792130 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:45:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:45:56.672 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:45:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:45:56.673 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:45:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:45:56.673 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:46:00 compute-0 nova_compute[187118]: 2025-11-24 14:46:00.177 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:01 compute-0 nova_compute[187118]: 2025-11-24 14:46:01.342 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:05 compute-0 nova_compute[187118]: 2025-11-24 14:46:05.177 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:05 compute-0 podman[224658]: 2025-11-24 14:46:05.48708836 +0000 UTC m=+0.081554359 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 24 14:46:06 compute-0 nova_compute[187118]: 2025-11-24 14:46:06.360 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:09 compute-0 podman[224677]: 2025-11-24 14:46:09.465025609 +0000 UTC m=+0.072587466 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=multipathd, io.buildah.version=1.41.3)
Nov 24 14:46:09 compute-0 podman[224678]: 2025-11-24 14:46:09.491302367 +0000 UTC m=+0.095240877 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 24 14:46:10 compute-0 nova_compute[187118]: 2025-11-24 14:46:10.180 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:11 compute-0 nova_compute[187118]: 2025-11-24 14:46:11.364 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:13 compute-0 podman[224716]: 2025-11-24 14:46:13.476477764 +0000 UTC m=+0.086403997 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 14:46:13 compute-0 podman[224717]: 2025-11-24 14:46:13.500129932 +0000 UTC m=+0.094213569 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9)
Nov 24 14:46:15 compute-0 nova_compute[187118]: 2025-11-24 14:46:15.183 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:16 compute-0 nova_compute[187118]: 2025-11-24 14:46:16.367 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:20 compute-0 nova_compute[187118]: 2025-11-24 14:46:20.185 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:21 compute-0 nova_compute[187118]: 2025-11-24 14:46:21.369 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:21 compute-0 podman[224763]: 2025-11-24 14:46:21.48179151 +0000 UTC m=+0.091597079 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:46:25 compute-0 nova_compute[187118]: 2025-11-24 14:46:25.187 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:26 compute-0 nova_compute[187118]: 2025-11-24 14:46:26.379 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:27 compute-0 podman[224787]: 2025-11-24 14:46:27.452574058 +0000 UTC m=+0.055378732 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:46:30 compute-0 nova_compute[187118]: 2025-11-24 14:46:30.189 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:31 compute-0 nova_compute[187118]: 2025-11-24 14:46:31.381 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:35 compute-0 nova_compute[187118]: 2025-11-24 14:46:35.190 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:36 compute-0 nova_compute[187118]: 2025-11-24 14:46:36.384 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:36 compute-0 podman[224812]: 2025-11-24 14:46:36.453343519 +0000 UTC m=+0.062800073 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 24 14:46:40 compute-0 nova_compute[187118]: 2025-11-24 14:46:40.191 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:40 compute-0 podman[224832]: 2025-11-24 14:46:40.446578851 +0000 UTC m=+0.057425489 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, 
managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0)
Nov 24 14:46:40 compute-0 podman[224831]: 2025-11-24 14:46:40.492468937 +0000 UTC m=+0.095366550 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251118)
Nov 24 14:46:41 compute-0 nova_compute[187118]: 2025-11-24 14:46:41.385 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:44 compute-0 podman[224873]: 2025-11-24 14:46:44.456066152 +0000 UTC m=+0.056960875 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal)
Nov 24 14:46:44 compute-0 podman[224872]: 2025-11-24 14:46:44.478831835 +0000 UTC m=+0.088480374 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 24 14:46:45 compute-0 nova_compute[187118]: 2025-11-24 14:46:45.194 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:45 compute-0 nova_compute[187118]: 2025-11-24 14:46:45.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:46 compute-0 nova_compute[187118]: 2025-11-24 14:46:46.387 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:48 compute-0 nova_compute[187118]: 2025-11-24 14:46:48.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:48 compute-0 nova_compute[187118]: 2025-11-24 14:46:48.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.196 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.792 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.811 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.811 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.811 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.839 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.839 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.839 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.860 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.861 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.861 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:46:50 compute-0 nova_compute[187118]: 2025-11-24 14:46:50.862 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.048 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.050 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5748MB free_disk=73.45822525024414GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.050 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.050 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.126 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.126 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.153 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.175 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.177 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.177 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:46:51 compute-0 nova_compute[187118]: 2025-11-24 14:46:51.390 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:52 compute-0 podman[224921]: 2025-11-24 14:46:52.445450491 +0000 UTC m=+0.057509410 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 24 14:46:55 compute-0 nova_compute[187118]: 2025-11-24 14:46:55.134 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:55 compute-0 nova_compute[187118]: 2025-11-24 14:46:55.134 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:46:55 compute-0 nova_compute[187118]: 2025-11-24 14:46:55.198 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:55 compute-0 nova_compute[187118]: 2025-11-24 14:46:55.790 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:56 compute-0 nova_compute[187118]: 2025-11-24 14:46:56.392 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:46:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:46:56.673 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:46:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:46:56.673 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:46:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:46:56.674 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:46:57 compute-0 nova_compute[187118]: 2025-11-24 14:46:57.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:46:58 compute-0 podman[224945]: 2025-11-24 14:46:58.437406729 +0000 UTC m=+0.048732273 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:47:00 compute-0 nova_compute[187118]: 2025-11-24 14:47:00.200 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:01 compute-0 nova_compute[187118]: 2025-11-24 14:47:01.394 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:01 compute-0 sshd-session[224969]: Connection closed by 3.80.182.148 port 53532 [preauth]
Nov 24 14:47:05 compute-0 nova_compute[187118]: 2025-11-24 14:47:05.202 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:06 compute-0 nova_compute[187118]: 2025-11-24 14:47:06.397 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:07 compute-0 podman[224971]: 2025-11-24 14:47:07.478755103 +0000 UTC m=+0.075176635 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_id=ovn_metadata_agent, org.label-schema.build-date=20251118, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 24 14:47:10 compute-0 nova_compute[187118]: 2025-11-24 14:47:10.204 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:11 compute-0 nova_compute[187118]: 2025-11-24 14:47:11.399 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:11 compute-0 podman[224992]: 2025-11-24 14:47:11.445805562 +0000 UTC m=+0.050977165 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251118, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 24 14:47:11 compute-0 podman[224993]: 2025-11-24 14:47:11.460377325 +0000 UTC m=+0.063139123 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251118, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3)
Nov 24 14:47:15 compute-0 nova_compute[187118]: 2025-11-24 14:47:15.206 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:15 compute-0 podman[225032]: 2025-11-24 14:47:15.468479329 +0000 UTC m=+0.071488387 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, 
url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Nov 24 14:47:15 compute-0 podman[225031]: 2025-11-24 14:47:15.489526786 +0000 UTC m=+0.099561414 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Nov 24 14:47:16 compute-0 nova_compute[187118]: 2025-11-24 14:47:16.402 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:20 compute-0 nova_compute[187118]: 2025-11-24 14:47:20.208 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:21 compute-0 nova_compute[187118]: 2025-11-24 14:47:21.404 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:23 compute-0 podman[225076]: 2025-11-24 14:47:23.455240285 +0000 UTC m=+0.059106935 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 24 14:47:25 compute-0 nova_compute[187118]: 2025-11-24 14:47:25.211 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:26 compute-0 nova_compute[187118]: 2025-11-24 14:47:26.407 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:29 compute-0 podman[225102]: 2025-11-24 14:47:29.442490956 +0000 UTC m=+0.055705022 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 24 14:47:30 compute-0 nova_compute[187118]: 2025-11-24 14:47:30.212 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:31 compute-0 nova_compute[187118]: 2025-11-24 14:47:31.410 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.131 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 ceilometer_agent_compute[197823]: 2025-11-24 14:47:35.133 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 24 14:47:35 compute-0 nova_compute[187118]: 2025-11-24 14:47:35.214 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:36 compute-0 nova_compute[187118]: 2025-11-24 14:47:36.413 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:38 compute-0 podman[225126]: 2025-11-24 14:47:38.440277048 +0000 UTC m=+0.051474399 container health_status 765b4431ccca8b33ef0dd64471a1274237a038e893fb4b9fa6391aae4f4674b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:a9583cb3baf440d2358ef041373833afbeae60da8159dd031502379901141620', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251118, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 24 14:47:40 compute-0 nova_compute[187118]: 2025-11-24 14:47:40.215 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:41 compute-0 nova_compute[187118]: 2025-11-24 14:47:41.415 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:42 compute-0 podman[225146]: 2025-11-24 14:47:42.439502192 +0000 UTC m=+0.049215647 container health_status f7306f44134b76843cc4b94cf98a43a94677da59488384113305ddf60f03576d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251118, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:9f9f367ed4c85efb16c3a74a4bb707ff0db271d7bc5abc70a71e984b55f43003', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 24 14:47:42 compute-0 podman[225145]: 2025-11-24 14:47:42.4401788 +0000 UTC m=+0.053123201 container health_status 8f7e34690c63db91ccd7b3aac5f42d75782be799dc4b88339a508834291a2a1c (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251118, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:6f86db36d668348be8c5b46dcda8b1fa23d34bfdc07164fbcbe7a6327fb4de24', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Nov 24 14:47:45 compute-0 nova_compute[187118]: 2025-11-24 14:47:45.218 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:45 compute-0 nova_compute[187118]: 2025-11-24 14:47:45.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:47:46 compute-0 nova_compute[187118]: 2025-11-24 14:47:46.417 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:46 compute-0 podman[225185]: 2025-11-24 14:47:46.478138779 +0000 UTC m=+0.086884802 container health_status 6e9a3d67692913077d2ebd394aea4e261bbfe956d1cc72071b4244991e72f4f1 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, version=9.6, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Nov 24 14:47:46 compute-0 podman[225184]: 2025-11-24 14:47:46.500657105 +0000 UTC m=+0.101095934 container health_status 48c38f7e1d1daac09868cdd7a83caf51fe983588dd18149ccda5cbfab45111ff (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251118, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:af46761060c7987e1dee5f14c06d85b46f12ad8e09c83d4246ab4e3a65dfda3e', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=7b76510d5d5adf2ccf627d29bb9dae76)
Nov 24 14:47:50 compute-0 nova_compute[187118]: 2025-11-24 14:47:50.220 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:50 compute-0 nova_compute[187118]: 2025-11-24 14:47:50.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:47:50 compute-0 nova_compute[187118]: 2025-11-24 14:47:50.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:47:50 compute-0 nova_compute[187118]: 2025-11-24 14:47:50.797 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.419 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.819 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.819 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.819 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.820 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.952 187122 WARNING nova.virt.libvirt.driver [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.953 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5748MB free_disk=73.45817565917969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.954 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:47:51 compute-0 nova_compute[187118]: 2025-11-24 14:47:51.954 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:47:52 compute-0 nova_compute[187118]: 2025-11-24 14:47:52.014 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 24 14:47:52 compute-0 nova_compute[187118]: 2025-11-24 14:47:52.014 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 24 14:47:52 compute-0 nova_compute[187118]: 2025-11-24 14:47:52.048 187122 DEBUG nova.compute.provider_tree [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed in ProviderTree for provider: 08b6207d-b34e-43d6-b1a7-1741d75aa10b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 24 14:47:52 compute-0 nova_compute[187118]: 2025-11-24 14:47:52.060 187122 DEBUG nova.scheduler.client.report [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Inventory has not changed for provider 08b6207d-b34e-43d6-b1a7-1741d75aa10b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 24 14:47:52 compute-0 nova_compute[187118]: 2025-11-24 14:47:52.062 187122 DEBUG nova.compute.resource_tracker [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 24 14:47:52 compute-0 nova_compute[187118]: 2025-11-24 14:47:52.062 187122 DEBUG oslo_concurrency.lockutils [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:47:52 compute-0 sshd-session[225229]: Accepted publickey for zuul from 192.168.122.10 port 39568 ssh2: ECDSA SHA256:YagaQ06xjDIqlHKDQI/DvfHVh7PpxgsnZuemfyfvrGo
Nov 24 14:47:52 compute-0 systemd-logind[807]: New session 30 of user zuul.
Nov 24 14:47:52 compute-0 systemd[1]: Started Session 30 of User zuul.
Nov 24 14:47:52 compute-0 sshd-session[225229]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Nov 24 14:47:52 compute-0 sudo[225233]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Nov 24 14:47:52 compute-0 sudo[225233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Nov 24 14:47:53 compute-0 nova_compute[187118]: 2025-11-24 14:47:53.063 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:47:53 compute-0 nova_compute[187118]: 2025-11-24 14:47:53.064 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 24 14:47:53 compute-0 nova_compute[187118]: 2025-11-24 14:47:53.065 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 24 14:47:53 compute-0 nova_compute[187118]: 2025-11-24 14:47:53.078 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 24 14:47:54 compute-0 podman[225350]: 2025-11-24 14:47:54.47921578 +0000 UTC m=+0.086202363 container health_status eb9880f1e2c40cc233bb6429af311bd234189018da2c47cc9c613e91da2c665d (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 24 14:47:54 compute-0 nova_compute[187118]: 2025-11-24 14:47:54.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:47:54 compute-0 nova_compute[187118]: 2025-11-24 14:47:54.797 187122 DEBUG nova.compute.manager [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 24 14:47:55 compute-0 nova_compute[187118]: 2025-11-24 14:47:55.222 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:55 compute-0 nova_compute[187118]: 2025-11-24 14:47:55.791 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:47:56 compute-0 nova_compute[187118]: 2025-11-24 14:47:56.422 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:47:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:47:56.674 104469 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 24 14:47:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:47:56.675 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 24 14:47:56 compute-0 ovn_metadata_agent[104464]: 2025-11-24 14:47:56.675 104469 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 24 14:47:56 compute-0 ovs-vsctl[225429]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 24 14:47:57 compute-0 virtqemud[186719]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 24 14:47:58 compute-0 virtqemud[186719]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 24 14:47:58 compute-0 virtqemud[186719]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 24 14:47:58 compute-0 nova_compute[187118]: 2025-11-24 14:47:58.796 187122 DEBUG oslo_service.periodic_task [None req-a2c065b9-c7f7-417d-862a-b0565b6f09a0 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 24 14:47:59 compute-0 crontab[225844]: (root) LIST (root)
Nov 24 14:48:00 compute-0 nova_compute[187118]: 2025-11-24 14:48:00.223 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:48:00 compute-0 podman[225926]: 2025-11-24 14:48:00.453881582 +0000 UTC m=+0.059744841 container health_status 14229925ae7a026d2d61a350141d31251c7e5479057b9ced25d6335182636440 (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 24 14:48:01 compute-0 systemd[1]: Starting Hostname Service...
Nov 24 14:48:01 compute-0 systemd[1]: Started Hostname Service.
Nov 24 14:48:01 compute-0 nova_compute[187118]: 2025-11-24 14:48:01.424 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 24 14:48:05 compute-0 nova_compute[187118]: 2025-11-24 14:48:05.226 187122 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
